Sound Design pipeline for cross-platform 360 virtual productions

Project partners: Edinburgh Napier University, VRTONUNG

A collaboration between Edinburgh Napier University and immersive audio experts VRTonung is developing a pipeline for sound design in 360 virtual productions.

Sound design plays an important role in heightening the sense of immersion in 360 immersive experiences. Despite this, the use of spatial audio in virtual production presents challenges for companies and creatives working in the field. 

A major issue is the cropping of 360 videos, which is done to resize footage for different platforms, such as TV or mobile, or to minimise dizziness in 360 VR experiences.

Once a video is cropped, the sources of some sounds, previously visible in the scene, fall outside the frame. Whilst offscreen sounds are valuable for guiding attention, providing acoustic context and sustaining interest in a scene, in 360 they can quickly become a distraction.

Software such as Insta360 Studio enables cropping of 360 videos to match the resolution of the preferred device, but there is currently no automated solution or established pipeline for cross-platform spatial sound design that reflects the cropping of the video.
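To illustrate the core of the problem, a minimal sketch (not part of the project's tooling, and with hypothetical names) can test whether a spatially placed sound source remains on screen after a crop, given the crop's viewing direction (yaw) and horizontal field of view:

```python
def is_onscreen(source_azimuth_deg, crop_yaw_deg, crop_hfov_deg):
    """Return True if a sound source at the given azimuth falls inside
    a cropped view centred on crop_yaw_deg and crop_hfov_deg wide."""
    # Signed angular difference mapped into (-180, 180], handling wraparound
    diff = (source_azimuth_deg - crop_yaw_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= crop_hfov_deg / 2.0

# A source at 130° azimuth is visible in a 90°-wide crop facing 90°,
# but becomes an offscreen sound once the crop narrows to 60°.
print(is_onscreen(130, 90, 90))  # True
print(is_onscreen(130, 90, 60))  # False
```

A pipeline of the kind the project envisages would need this check (and its vertical counterpart) for every sound object and every target platform's crop, so that sources pushed offscreen can be rebalanced, attenuated or re-spatialised rather than left to distract the viewer.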

The project will explore and prototype a first step towards defining an industry pipeline for spatial audio production in 360 video across multiple platforms. The research team will also investigate how different sound design techniques, camera movement, and different video and sound formats affect feelings of dizziness.

The collaboration is one of seven initiatives supported by the XR Network+ Prototyping, Impact and Acceleration (PIA) funding call, with grants of up to £10,000 awarded to researchers at UK universities to develop new ideas and complete existing research related to virtual production (VP).


Categories: Arts, Film, Games, Research, Technology, TV