Biometric Bodies Breathe: Exploring Heart Rate Synchrony and Embodied Feedback Loops Using Real-Time Data Capture in Immersive XR Virtual Production
Project partners: University of the West of England, University of Bristol, Daniel Bacchus, Joana Penso
A collaboration led by the University of the West of England (UWE) is developing a biometrically responsive extended reality (XR) prototype that integrates real-time heart rate data, audio capture, and motion tracking to generate responsive avatars in immersive environments.
The project brings together multi-disciplinary research expertise from UWE and the University of Bristol that encompasses immersive arts, biometric sensing, embodied performance and neuropsychology. Artists Daniel Bacchus and Joana Penso are providing expertise in real-time, data-driven digital artworks and creative sound design.
The project team will utilise state-of-the-art facilities at The Bridge Studios (UWE) and at the Bristol Centre for Digital Futures (University of Bristol and MyWorld).
Wearable heart rate monitors and stethoscope microphones will capture live biometric and audio data from participants’ bodies; these signals will be translated into dynamic visual, spatial, and sonic outputs within the immersive XR environment.
Motion capture data will be layered with biometric inputs in real time to produce embodied, data-driven avatars. Participants’ heart rate data (e.g. beats, rhythms and synchronicities) will directly influence the form, behaviour, feedback and synchronicity of avatars within the XR environments.
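As a rough illustration of the kind of mapping described above, a live heart rate reading can drive avatar parameters such as a pulsing scale and a colour "warmth". The sketch below is a minimal, hypothetical version in Python; the function name, parameter ranges and outputs are illustrative assumptions, not the project's actual pipeline.

```python
import math

def avatar_params(bpm: float, t: float) -> dict:
    """Map a heart rate sample (BPM) at time t (seconds) to avatar parameters.

    Illustrative assumptions: BPM is clamped to a plausible human range,
    the avatar 'breathes' once per beat, and warmth rises linearly with BPM.
    """
    bpm = max(40.0, min(180.0, bpm))           # clamp to a plausible range
    phase = 2 * math.pi * (bpm / 60.0) * t     # one pulse cycle per beat
    scale = 1.0 + 0.05 * math.sin(phase)       # gentle pulsing of avatar size
    warmth = (bpm - 40.0) / 140.0              # 0..1, faster heart reads warmer
    return {"scale": scale, "warmth": warmth}
```

In a real engine these values would be fed each frame to the avatar's transform and material, with smoothing to mask sensor noise.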
In the latter stages of prototype development, the project team will explore heart rate synchronicity between participants within multi-user participatory XR scenarios to generate shared feedback loops that manifest in the XR space (e.g. avatars pulsing in unison, environments shifting rhythmically, or avatars directly affecting responsive environments).
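Heart rate synchrony of this kind is often quantified as the correlation between two participants' recent heart rate windows. The sketch below is a hypothetical minimal version using Pearson correlation over equal-length BPM windows; the window contents and any triggering threshold are assumptions for illustration only.

```python
import math

def synchrony(window_a: list[float], window_b: list[float]) -> float:
    """Pearson correlation of two equal-length BPM windows, in [-1, 1].

    A score near 1 means the two heart rates are rising and falling
    together; near -1 means they move in opposition.
    """
    n = len(window_a)
    mean_a = sum(window_a) / n
    mean_b = sum(window_b) / n
    cov = sum((a - mean_a) * (b - mean_b) for a, b in zip(window_a, window_b))
    var_a = sum((a - mean_a) ** 2 for a in window_a)
    var_b = sum((b - mean_b) ** 2 for b in window_b)
    if var_a == 0 or var_b == 0:
        return 0.0  # a flat signal carries no rhythm to synchronise with
    return cov / math.sqrt(var_a * var_b)
```

A shared feedback loop could then, for example, pulse two avatars in unison whenever the score crosses a chosen threshold.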
Through user-testing and data-gathering, the collaboration will explore how physiological heart-rate synchronisation can foster affective connection across distributed physical and virtual immersive sites. The project has wide-reaching implications for socially-connected immersive experiences, collaborative XR performance, and therapeutic/wellbeing interventions.
The R&D builds on early-stage prototypes developed through collaborative Digital Culture Research Centre (UWE) x Immersive Arts Fellowship research, which explored how XR technologies could shape visually, spatially and sensorially responsive data representations and avatars of biometric bodies in immersive contexts.
The project is one of eight collaborations awarded R&D funding from XR Network+ through the XR Labs Fund. The funding call awarded grants of up to £25,000 for university-led collaborations to develop extended reality prototypes using facilities at UK universities.
Photo by Danny Bacchus at the Bridge Studio, UWE.
Categories: Performance, Research, Technology
