Remote Collaborative Networking Tools for Virtual Dance Creation: AWDC’s Otmo and Goldsmiths Mocap Streamer

Project partners: Goldsmiths University of London, Alexander Whitley Dance Company

XR Network+ has supported a collaboration between Dr Daniel Strutt at Goldsmiths, University of London, and Alexander Whitley Dance Company (AWDC), first to deliver a prototype and then to develop the project into a suite of professional performance tools.

In 2023-24, Dr Daniel Strutt and AWDC received Prototyping, Impact and Acceleration (PIA) funding from XR Network+ to integrate Goldsmiths Mocap Streamer (GMS) with AWDC’s ‘Otmo Live’ software – an application that allows users to position, modify, sequence, and share 3D human movement in a virtual space.

This integration created a virtual production tool capable of receiving live dance inputs from anywhere in the world, enabling remote responsive collaboration between dancer-choreographers. The successful prototype was showcased in March 2024.

This latest phase of work built on those earlier developments by refining and expanding the remote motion-capture performance tools, positioning Otmo Live as an industry-specific virtual production tool for theatrical performance. The team focused on enabling real-time collaboration, increasing system compatibility, and enhancing usability for performers and creators. The work culminated in high-profile showcases at Beyond Conference, the Digital Body Festival (2024), Jacob’s Pillow Festival in the USA (2025), and the SxSW London Fashion District opening ceremony.

A major achievement of this latest phase was system interoperability: the team created a generalised streaming framework that allows different mocap systems to be integrated. This makes remote, professional-quality performance more accessible and less reliant on advanced technical expertise.
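To illustrate the idea of a generalised streaming framework, the sketch below shows one common way such interoperability can be structured: each mocap system is wrapped in an adapter that converts its native output into a single shared frame format, so downstream tools only ever consume that one format. The names and types here are illustrative assumptions, not the project's actual API.

```typescript
// Hypothetical sketch of a generalised mocap streaming layer. Each vendor
// system is wrapped in an adapter that emits a shared SkeletonFrame, so a
// downstream consumer never needs vendor-specific code.

// Common frame format shared by every adapter (assumed, for illustration).
interface SkeletonFrame {
  performerId: string;
  timestampMs: number;                                // capture time, used for sync
  joints: Record<string, [number, number, number]>;   // joint name -> x, y, z
}

// Anything that can produce frames, regardless of the underlying mocap system.
interface MocapSource {
  start(onFrame: (frame: SkeletonFrame) => void): void;
  stop(): void;
}

// Example adapter: converts a vendor-specific feed into the shared format.
// A timer stands in for the vendor SDK or network socket here.
class ExampleVendorAdapter implements MocapSource {
  private timer?: ReturnType<typeof setInterval>;

  start(onFrame: (frame: SkeletonFrame) => void): void {
    this.timer = setInterval(() => {
      onFrame({
        performerId: "dancer-1",
        timestampMs: Date.now(),
        joints: { hips: [0, 1, 0], head: [0, 1.7, 0] },
      });
    }, 33); // roughly 30 frames per second
  }

  stop(): void {
    if (this.timer) clearInterval(this.timer);
  }
}

// A consumer (a viewer, recorder, or remote stream) only sees SkeletonFrame.
const source: MocapSource = new ExampleVendorAdapter();
source.start((frame) => console.log(frame.timestampMs, Object.keys(frame.joints)));
setTimeout(() => source.stop(), 1000);
```

The value of this pattern is that adding support for a new mocap system only requires a new adapter, while every existing tool built on the shared format keeps working unchanged.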

Additional innovations included tools for multi-platform motion-capture import, live capture, recording, and playback in the Otmo desktop app, and a new framework for keeping audio and video streams tightly synchronised with motion-capture data. All of this was achieved with minimal latency, even across global distances, making the system robust and scalable.
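As a rough illustration of how audio and video can be aligned with motion-capture data, the sketch below shows the general timestamp-matching technique: every incoming packet carries a capture timestamp, and a small buffer lets the consumer pick the media frame nearest to each mocap frame. This is a minimal sketch of the technique only; the class names and buffer policy are assumptions, not the project's implementation.

```typescript
// Illustrative timestamp-based alignment between a mocap stream and a media
// stream. Packets are buffered briefly and matched by nearest capture time.

interface StampedPacket<T> {
  timestampMs: number; // capture time attached at the source
  payload: T;
}

class SyncBuffer<T> {
  private packets: StampedPacket<T>[] = [];

  push(packet: StampedPacket<T>): void {
    this.packets.push(packet);
    // Drop anything older than one second to keep the buffer small.
    const cutoff = packet.timestampMs - 1000;
    this.packets = this.packets.filter((p) => p.timestampMs >= cutoff);
  }

  // Return the buffered packet whose timestamp is closest to the given time.
  nearest(timestampMs: number): StampedPacket<T> | undefined {
    let best: StampedPacket<T> | undefined;
    for (const p of this.packets) {
      if (!best || Math.abs(p.timestampMs - timestampMs) <
                   Math.abs(best.timestampMs - timestampMs)) {
        best = p;
      }
    }
    return best;
  }
}

// Usage: pair each mocap frame with the nearest video frame by timestamp.
const videoBuffer = new SyncBuffer<string>();
videoBuffer.push({ timestampMs: 1000, payload: "video-frame-A" });
videoBuffer.push({ timestampMs: 1033, payload: "video-frame-B" });

const mocapFrameTime = 1030;
console.log(videoBuffer.nearest(mocapFrameTime)?.payload); // "video-frame-B"
```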

This collaboration is one of seven projects supported by the Embedded R&D (round two) funding call, with grants of up to £60,000 awarded to researchers at UK universities to explore the transfer of knowledge between academia and industry in areas aligned with Virtual Production. The projects ran over a six-month period commencing in September 2024.

Photo by Cherylynn Tsushima; courtesy of Jacob’s Pillow.

Categories: Performance, Research, Technology