Booking info

Cost: Free

Book

Online seminar: Natural language processing and XR

3:00 pm, 21 Jan 2026

Online

Learn about two R&D projects that are developing natural language processing tools for animators and directors working in Virtual Production.

Zhuoling Jiang (Cardiff University) will present on a collaboration between Cardiff University and Megaverse that has produced a prototype natural language interface giving virtual characters the ability to interpret and incorporate on-set direction into their performance. The prototype interprets the director’s spoken or written instructions regarding desired emotions, facial expressions and lip movements, and maps this input in real time to the character’s facial animation, so the performance reflects the direction.
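To make the idea concrete, here is a minimal sketch of one way a direction-to-animation mapping could work: an instruction is scanned for an emotion keyword, which is mapped to facial blendshape weights. The function, the keyword lookup and the blendshape names are all hypothetical illustrations, not the actual design of the Cardiff/Megaverse prototype.

```python
# Toy mapping from emotion keywords to facial blendshape weights (0.0-1.0).
# The emotion and blendshape names here are illustrative assumptions.
EMOTION_BLENDSHAPES = {
    "surprised": {"browRaise": 0.9, "jawOpen": 0.6},
    "angry": {"browFurrow": 0.8, "lipPress": 0.7},
    "happy": {"mouthSmile": 0.9, "cheekRaise": 0.5},
}

def interpret_direction(instruction: str) -> dict:
    """Return blendshape weights for the first emotion keyword found."""
    text = instruction.lower()
    for emotion, weights in EMOTION_BLENDSHAPES.items():
        if emotion in text:
            return weights
    return {}  # no recognised emotion: leave the performance unchanged

weights = interpret_direction("Play that line again, but more surprised")
```

A production system would replace the keyword lookup with a language model and drive the character rig continuously, but the overall shape of the mapping, instruction in, animation parameters out, is the same.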

Theodore Koterwas (University of Edinburgh) will then present on the StyleCap project – a collaboration between the University of Edinburgh and Retinize. This project developed a proof-of-concept machine learning model that styles the movements of virtual characters in real time using natural language prompts. The model creates full-body animation from three datapoints and a style label, laying a foundation for animators to say, for example, “walk with attitude” and have the animation reflect the instruction in real time, speeding up the animation process.
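The interface such a model might expose can be sketched as follows: three tracked datapoints (assumed here, purely for illustration, to be head and hand positions) plus a natural language style label in, a full-body pose out. The function body below is a trivial stand-in, not the StyleCap model itself.

```python
# Hypothetical sketch of a style-conditioned pose interface.
# A real model would infer every body joint; this stub only echoes inputs.

def style_pose(datapoints: list, style: str) -> dict:
    """Toy stand-in for a learned model inferring a full-body pose."""
    if len(datapoints) != 3:
        raise ValueError("model expects exactly three datapoints")
    head, left_hand, right_hand = datapoints
    # A learned model would fill in the remaining joints conditioned on
    # the style label (e.g. "walk with attitude"); we just record it.
    return {"head": head, "leftHand": left_hand,
            "rightHand": right_hand, "style": style}

pose = style_pose([(0.0, 1.7, 0.0), (-0.3, 1.0, 0.2), (0.3, 1.0, 0.2)],
                  "walk with attitude")
```

The point of the sketch is the sparsity of the input: from only three datapoints and a label, the learned model must reconstruct and style an entire body's motion each frame.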

This webinar will be hosted by Melissa Terras (University of Edinburgh, XR Network+ Co-Investigator), and following the presentations, Melissa will chat to the presenters about their work. The webinar will also feature an update on current activity from the wider XR Network+ project, including what is coming up in 2026.
