A Natural Language Interface for Directing Virtual Character Performances

Project partners: Cardiff University, Megaverse

XR Network+ has supported a collaboration led by researchers at Cardiff University, involving experts from Bournemouth University, Buckinghamshire New University and Media Cymru Innovation Space, alongside industry partners at Megaverse. Together, they’ve developed a fully functional natural language interface for directing virtual character performances.

The prototype system interprets a director’s spoken or written instructions regarding desired emotions, facial expressions and lip movements, and maps this input in real time to control the facial animations of a virtual character. This is significant because it bridges the gap between virtual and traditional production techniques. The interface democratises virtual production by enabling directors with ambitious creative ideas to pursue projects without needing a full team of animators or live actors.
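At a high level, such a pipeline translates a free-form direction into a small set of animation parameters that are eased onto the character rig each frame. The sketch below is illustrative only: the project's actual implementation has not been published, so ExpressionTarget, StubRig, the channel names and the easing scheme are all hypothetical stand-ins.

```python
from dataclasses import dataclass, field

@dataclass
class ExpressionTarget:
    """Hypothetical output of the language layer: an emotion label plus
    per-channel blendshape weights (all names here are illustrative)."""
    emotion: str = "neutral"
    blendshapes: dict[str, float] = field(default_factory=dict)  # channel -> 0..1
    transition_secs: float = 0.4  # how quickly to ease toward the target

class StubRig:
    """Minimal stand-in for a real character rig's blendshape interface."""
    def __init__(self) -> None:
        self.weights: dict[str, float] = {}
    def get_weight(self, channel: str) -> float:
        return self.weights.get(channel, 0.0)
    def set_weight(self, channel: str, value: float) -> None:
        self.weights[channel] = value

def apply_to_rig(rig: StubRig, target: ExpressionTarget, dt: float) -> None:
    """Ease each blendshape channel toward its target weight; run once per frame."""
    rate = min(1.0, dt / target.transition_secs)
    for channel, weight in target.blendshapes.items():
        current = rig.get_weight(channel)
        rig.set_weight(channel, current + (weight - current) * rate)

# Example: "show a disappointed face" might resolve to something like this.
rig = StubRig()
disappointed = ExpressionTarget(
    emotion="disappointed",
    blendshapes={"browInnerUp": 0.3, "mouthFrownLeft": 0.6, "mouthFrownRight": 0.6},
)
for _ in range(30):  # simulate 30 frames at roughly 60 fps
    apply_to_rig(rig, disappointed, dt=1 / 60)
```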

To bring this prototype to life, the team integrated several advanced technologies. They employed OpenAI’s natural language processing capabilities, adapting them to interpret spoken or written directions about facial expressions. This allowed the system to understand high-level instructions like “show a disappointed face” or “raise your eyebrows a bit higher”.
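The article does not publish the project's actual prompts or model configuration, but a minimal version of this language layer could look like the sketch below, which asks an OpenAI chat model to turn a direction into blendshape weights as JSON. The model name, system prompt and output schema are all assumptions, not the team's implementation.

```python
import json
from openai import OpenAI  # pip install openai

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = (
    "You translate a film director's spoken instruction into facial animation "
    "parameters. Reply with JSON only: an 'emotion' label and a 'blendshapes' "
    "object mapping ARKit-style channel names (e.g. browInnerUp, mouthFrownLeft) "
    "to weights between 0 and 1."
)

def direction_to_parameters(instruction: str) -> dict:
    """Map one natural-language direction to facial animation parameters."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative choice; the project's model is not stated
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": instruction},
        ],
        response_format={"type": "json_object"},  # request machine-readable output
    )
    return json.loads(response.choices[0].message.content)

# e.g. direction_to_parameters("show a disappointed face")
# -> {"emotion": "disappointed", "blendshapes": {"browInnerUp": 0.3, ...}}
```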

Integrating these technologies into the final prototype demonstrated the feasibility of using natural language to drive a direction interface for virtual characters, and proved the approach both accessible and useful in real-time situations. The success of this project, and of the collaboration with industry partner Megaverse, has laid the foundation for future work and for expanding the project in line with emerging trends in AI-driven virtual production.

This collaboration is one of seven projects supported by the Embedded R&D (round two) funding call, with grants of up to £60,000 awarded to researchers at UK universities to explore the transfer of knowledge between academia and industry in areas aligned with Virtual Production. The projects took place over a six-month period, commencing in September 2024.

Image credit: Yipeng Qin.


Categories: Film, Games, Performance, Research, Technology, TV