Exploring Subtitle Data for Interactive Experiences in Virtual Production
Project partners: Bournemouth University, University of Dundee, ITV
XR Network+ has supported partners at Bournemouth University and the University of Dundee in collaboration with ITV to develop three new, working prototype tools to enhance and explore subtitle use in interactive experiences for Virtual Production (VP).
The tools have been designed to repurpose subtitle data for media segmentation, transformation, and interaction. Subtitle files – which represent the text version of spoken words (and sometimes sounds) in video or film content – are typically used for accessibility, translation, or clarity and have often been seen as having limited further use. This project, however, demonstrated their significant creative and editorial potential.
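Subtitle data of this kind is commonly stored in plain-text formats such as SubRip (.srt), where each cue carries an index, a timecode range, and the spoken text. It is this timing and text metadata that makes the repurposing described here possible. An illustrative (invented) fragment:

```text
1
00:00:01,000 --> 00:00:03,400
Welcome back to the studio.

2
00:00:03,600 --> 00:00:06,200
Thanks for having me.
```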
Whilst the original objective was to improve a single existing prototype, close collaboration with ITV and an iterative, agile process led the team to develop a suite of three working prototypes exploring different aspects of the VP workflow. Each tool addresses a distinct use case – from media segmentation to accessibility and stylised content generation – while drawing on the same underlying subtitle data.

The first tool facilitates subtitle-based segmentation of video, supporting speaker-, chapter-, and time-based filtering, which enables the rapid creation of themed or accessible edits. The second generates interactive, print-style outputs (e.g. graphic novel-style summaries) from subtitle and video data, enhancing both accessibility and media literacy. The third analyses subtitle tone, sentiment, and character dynamics to generate stylised media outputs – for example, using subtitle cues to reflect anger, intensity, or humour through visual styles, or applying character-specific looks to distinguish different speakers.
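The segmentation idea can be sketched in a few lines: once subtitle cues are held with their timings and speaker labels, speaker- and time-based filtering reduce to simple predicates. The following is a minimal illustration only – the cue structure, speaker labels, and function names are assumptions for the sketch, not the project's actual data model:

```python
from dataclasses import dataclass

@dataclass
class Cue:
    start: float   # seconds from programme start
    end: float
    speaker: str   # speaker label, if the subtitle data provides one
    text: str

def by_speaker(cues, name):
    """Keep only cues attributed to one speaker."""
    return [c for c in cues if c.speaker == name]

def by_time(cues, t0, t1):
    """Keep cues that overlap the window [t0, t1]."""
    return [c for c in cues if c.end > t0 and c.start < t1]

# Invented example cues
cues = [
    Cue(0.0, 2.5, "ANNA", "Welcome back."),
    Cue(2.5, 5.0, "BEN", "Thanks for having me."),
    Cue(5.0, 8.0, "ANNA", "Let's begin."),
]

print(len(by_speaker(cues, "ANNA")))  # 2
print(len(by_time(cues, 4.0, 6.0)))   # 2
```

Chapter-based filtering would work the same way, with cues grouped against a list of chapter boundaries rather than a single window.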
All three tools were evaluated through participant interviews. Feedback confirmed that they were intuitive, novel, and highly relevant to both industry and educational contexts. These insights informed the final refinements and highlighted their potential for broader application.
Future research with ITV will explore how structured media metadata – including subtitles, scripts, and audio descriptions – can be repurposed to drive automation, accessibility, and audience engagement across VP workflows.
This project is one of four initiatives supported by the XR Network+ Prototyping, Impact and Acceleration (PIA) round two funding call. Grants of up to £10,000 were awarded to researchers at UK universities to develop new ideas and complete existing research related to Virtual Production. The projects took place over a six-month period, commencing in September 2024.
Image credit: Benjamin Gorman.
Categories: Film, Research, Technology, TV
