XR-Sync (Extended Reality Stage Synchronization)
Project partners: University of Greenwich, Target3D
A project led by the University of Greenwich is developing new workflows to blend virtual sets with physical stages in real time.
Convergent media and virtual production require precise alignment between physical studio elements and digital 3D environments, and achieving that alignment poses spatial challenges for set assembly and actor positioning.
In collaboration with Target3D, the project will leverage the Shared Hub for Immersive Future Technologies (SHIFT) facilities at the University of Greenwich, with support from the Creative Futures Centres, to develop and test adaptable extended reality (XR) workflow solutions for different production scales across a range of accessible systems.
The research will connect a Virtual Reality (VR)/XR headset with nDisplay/Switchboard and studio coordinate systems (Mo-Sys, HTC VIVE MARS and OptiTrack) to synchronise the headset view with the stage’s real-world tracking data. With the two aligned, users can step onto the stage and see the 3D environment precisely overlaid in their headset at accurate scale and position.
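At its core, this kind of synchronisation means re-expressing points from the stage tracker's coordinate frame in the headset's render frame via a calibration transform. The sketch below is a minimal illustration of that idea using a rigid 4x4 homogeneous transform; the function names, rotation axis, and calibration values are hypothetical and are not drawn from the project's actual implementation.

```python
import numpy as np

def make_transform(rotation_deg: float, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform: rotate about Z, then translate."""
    theta = np.radians(rotation_deg)
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
    T[:3, 3] = translation
    return T

# Hypothetical calibration: where the stage tracker's origin sits,
# and how it is oriented, in the headset's world frame.
stage_to_headset = make_transform(90.0, np.array([1.0, 0.0, 0.0]))

def to_headset_frame(point_stage: np.ndarray) -> np.ndarray:
    """Re-express a tracked stage-space point in headset coordinates."""
    p = np.append(point_stage, 1.0)   # homogeneous coordinates
    return (stage_to_headset @ p)[:3]

# A tracked marker on the stage floor, in the tracker's own frame.
marker = np.array([2.0, 0.0, 0.0])
print(to_headset_frame(marker))  # rotated 90 degrees about Z, shifted +1 in X
```

In practice the calibration matrix would come from the studio tracking system rather than being hard-coded, and it would be re-applied every frame so that virtual set pieces stay locked to their physical counterparts as the wearer moves.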
This immediate, interactive visual feedback will enable creative teams to build, arrange, and refine set pieces and actor positions in real-time, reducing errors and guesswork.
By bridging digital and physical workflows, streamlining set assembly, and enhancing creative choices, the prototype will improve efficiency and strengthen collaboration between filmmakers, designers, and technical staff working at the convergence of physical and virtual worlds.
The project is one of eight collaborations that received R&D funding from XR Network+ through the XR Labs Fund. The funding call awarded grants of up to £25,000 for university-led collaborations to develop extended reality prototypes using facilities at UK universities.
Photograph by Sakina Beladioui.
Categories: Performance, Research, Technology
