Space Synth: A spatial instrument suite for live performance

Project partners: University of York, Leeds Beckett University, Gazelle Twin, Dan Conway, Kola Ganikale

XR Network+ is supporting a project led by researcher Ben Eyes at the University of York’s School of Arts and Creative Technologies that is creating a suite of tools that can be combined for live performance in 3D spatial audio spaces and venues.

Current spatial audio tools for musicians tend to be designed for post-production workflows, using automation of parameters such as panning and reverb to create spatial audio effects in a Digital Audio Workstation such as Pro Tools or Ableton Live.

These tools tend to be non-real-time and require complex automation that must then be rendered out, which slows creative flow and adds a layer of complexity to live performance.

Work has begun in the field to tackle these challenges (e.g. products such as Sound Particles' Sky Dust, which claims to be the first 3D synthesiser). However, the workflow is complex and does not lend itself to real-time performance, nor does it offer the reliability and stability that live control demands.

The Space Synth project is creating a suite of tools for sound creation and manipulation in live spatial audio performance. The suite will allow musicians, sound designers and composers to combine performance data, synthesis, sampling and spatial effects to create compelling, spatially exciting live performances.

Space Synth will use open-source code so that the resulting outputs are accessible to XR creatives via a 'pay what you feel' model.

The collaboration will draw on expertise and input from electronic musician Elizabeth Bernholtz (Gazelle Twin), performer and PhD researcher Kola Ganikale (University of York), and video artist Dan Conway.

The team will explore the features and workflow that performing artists want for live performance and spatial composition. Their work will investigate how the gestures of performing musicians can be captured and converted into thrilling, immersive live sound experiences.

The project is one of eight collaborations that received R&D funding from XR Network+ through the XR Labs Fund. The funding call awarded grants of up to £25,000 for university-led collaborations to develop extended reality prototypes using facilities at UK universities.

Image by Ben Eyes

Categories: Performance, Research, Technology