A Natural Language Interface for Directing Virtual Character Performances

Project partners: Cardiff University, Sugar Creative

XR Network+ is supporting a collaboration between researchers at Cardiff University and the creative innovation studio Sugar Creative to develop a natural language interface for directing virtual character performances.

The system will interpret the director's spoken or written instructions regarding desired emotions, facial expressions and lip movements, and map this input in real time to control the facial animations of a virtual character.

The research and development undertaken as part of this project will bridge the gap between virtual and traditional production techniques for directors accustomed to working in live action. The interface will democratise the virtual production process by enabling directors with ambitious creative ideas to undertake projects without needing a full team of animators or live actors.

The wider project team includes experts from Bournemouth University, Buckinghamshire New University and Media Cymru Innovation Space. The team anticipates that insights and systems developed within the project will be applicable to XR (extended reality) educational content, XR story-led content, XR games and XR geolocational content. 

The outputs of the project will also be harnessed in the evolution of a number of current and future XR pieces being undertaken in partnership with The Chemical Brothers, which explore how music, data from live shows and XR can be hybridised to create new forms of experiential content.

This collaboration is one of seven projects supported by the XR Network+ Embedded R&D (round two) funding call, through which grants of up to £60,000 were awarded to researchers at UK universities to explore the transfer of knowledge between academia and industry in areas aligned with virtual production.

Image credit to Yipeng Qin.
Categories: Film, Games, Performance, Research, Technology, TV