The School of Digital Arts is a purpose-built, interdisciplinary school at one of the UK's leading universities, offering industry- and research-informed courses and specialist spaces equipped with the latest technologies. The School of Digital Arts is a proud part of Manchester Metropolitan University. We build on the creative, scientific, technological and business strengths of a university whose research is rated as 'world-leading' and is changing the way we live, work, learn and play.
AI systems are increasingly able to detect a speaker's emotions, opening up a new affective channel that can be explored in art. The controls available in standard Virtual Reality (VR) can be supplemented with speech recognition, natural language processing and sentiment analysis. We aim to embody this potential in the front end of the Emote VR Voicer interface, which will translate the detected meaning of vocal utterances into the morphing of abstract 3D animated shapes, enabling a radically new aesthetic experience. We are using iterative design cycles and ultimately aim to develop an interface that will improve the participant's wellbeing. Candidates do not need prior arts-sector experience, only openness to interdisciplinary collaboration.
Working within the School of Digital Arts (SODA), you will join state-of-the-art research on the AHRC-funded Emote VR Voicer project to develop a new, intelligently responsive VR app that incorporates speech recognition and meaning classification. You will help create a system that can detect live emotional content in the spoken (or sung) word, mapping this to visual animations based on sample banks of specially created 3D emotion shapes. You will be responsible for the VR development strand of the project.
About the role:
You will work closely within a small project team consisting of artists, a psychologist and AI researchers in an iterative development cycle. You will use your programming skills and Unity experience to integrate AI models that detect and tag emotional meaning from audio, mapping the results to steer real-time visuals in Unity. Live audio features will also be mapped to animate graphics. Working closely with the project lead, you will bring together assets to create animation blend trees and combine these with procedural animation, so that the shapes are animated differently depending on which emotion the system detects. Image synthesis, procedural content generation and style transfer will further expand a bank of 3D graphics created specifically for this project. You will also be involved in some of the evaluation work and in writing up the research for publication(s).
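To give a feel for the kind of mapping described above, here is a minimal, purely illustrative sketch of how a detected emotion label and classifier confidence might steer blend weights over a set of emotion shapes. It is written in Python for readability; in the project itself this logic would live in Unity C# driving an animation blend tree, and the label set and smoothing factor below are hypothetical, not part of the role specification.

```python
# Illustrative only: map a detected emotion label + confidence to blend
# weights over a hypothetical set of 3D "emotion shapes", with per-frame
# smoothing so the shapes morph rather than snap between states.

EMOTIONS = ["joy", "sadness", "anger", "calm"]  # hypothetical label set

def target_weights(label, confidence):
    """Weight the detected shape by classifier confidence; spread the
    remainder over the other shapes so the weights sum to 1."""
    rest = (1.0 - confidence) / (len(EMOTIONS) - 1)
    return {e: (confidence if e == label else rest) for e in EMOTIONS}

def smooth(current, target, alpha=0.1):
    """Exponential smoothing per frame, approximating a blend-tree
    parameter easing toward its target value."""
    return {e: current[e] + alpha * (target[e] - current[e]) for e in EMOTIONS}

# Start neutral, then ease toward "joy" detected at 0.9 confidence
weights = {e: 1.0 / len(EMOTIONS) for e in EMOTIONS}
for _ in range(60):  # roughly one second of frames at 60 fps
    weights = smooth(weights, target_weights("joy", 0.9))
```

In Unity, the resulting weights would typically be written to animator parameters each frame (e.g. via `Animator.SetFloat`) to drive the blend tree.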
The post is for 2.5 days per week (0.5 FTE) on a fixed-term basis for 8 months. The working pattern will be mostly on-campus, with some remote working possible depending on project stage.
About you:
Key skills:
- A good understanding of programming within the Unity game engine using C#, and experience with VR application development
Essential skills and experience:
- A PhD in computer science, software engineering or a similar technical field, or equivalent professional experience
- Experience developing projects with C#
- Hands-on experience developing Metaverse/VR applications using Unity
- Proficiency with scripting for procedural animation generation
- Experience with writing and co-writing research papers
- Experience with image and/or audio-based projects
- Experience with real-time system optimisation (e.g. low-latency audio/visual feedback in VR)
- Experience with data backup systems
- Experience with working in interdisciplinary teams
- Excellent communication and interpersonal skills
- Creative problem-solving skills
- Self-motivated and able to undertake independent research related to the brief
- Excellent ability to work to deadlines
Desirable:
- Experience with user testing or co-design methods, in arts or health settings
- Knowledge of the peer review process for research projects and journal articles
- Experience with bringing Python models into Unity
- Proficiency with Autodesk Maya modelling, skinning and rigging
- Familiarity with research project productivity timelines
- Sensitivity to nuances in visual aesthetics
To apply, please submit your CV, a cover letter explaining how you meet the criteria and include a link to previous relevant work, and two named references via our application portal. If you would like to discuss the role, please email Adinda at: A.vant.Klooster@mmu.ac.uk
Manchester Metropolitan University fosters an inclusive culture of belonging that promotes equity and celebrates diversity. We value a diverse workforce for the innovation and diversity of thought it brings and welcome applications from all local and international communities, including Black, Asian, and Minority Ethnic backgrounds, disabled people, and LGBTQ+ individuals.
We support a range of flexible working arrangements, including hybrid and tailored schedules, which can be discussed with your line manager. If you require reasonable adjustments during the recruitment process or in your role, please let us know so we can provide appropriate support.
Our commitment to inclusivity includes mentoring programmes, accessibility resources, and professional development opportunities to empower and support underrepresented groups.
Manchester Met is a Disability Confident Leader and, under this scheme, aims to offer an interview to disabled people who apply for the role and meet the essential criteria as listed in the attached Job Description for that vacancy.
Details
- Location: Manchester All Saints Campus
- Faculty / Function: Arts & Humanities
- Salary: Grade 7 (£35,608) pro rata
- Closing Date: 19 March 2026
- Contract Type: Fixed Term
- Contract Length: 8 months
- Contracted Hours per week: 17.5