Development of a High-Fidelity Earthquake Simulation Environment for Virtual Production Based on Unreal Engine

Project partners: Royal College of Art, UNSW Sydney, Foster + Partners

XR Network+ has successfully supported a collaboration between researchers at the Royal College of Art (Principal Investigator: Dr Ali Asadipour), UNSW Sydney, and London-based architecture studio Foster + Partners, to design and develop an advanced digital twin and real-time earthquake simulation platform. 

Earthquakes pose a serious threat to human life and infrastructure. Simulations are vital for improving preparedness, emergency response and resilient design, but current methods are often resource-heavy and insufficiently interactive. In the past decade, major quakes in Nepal, Ecuador, Indonesia, Haiti, Türkiye and Syria have caused over 70,000 deaths and more than £65 billion in damage. By contrast, the magnitude 8.8 earthquake off Kamchatka, Russia, in 2025 caused infrastructure damage and injuries but no deaths, thanks to strong building standards and alert systems. More effective and accessible simulation tools could help reduce both casualties and destruction.

This project set out to develop novel techniques to automate disaster simulation, with a focus on creating a high-fidelity, real-time earthquake simulation environment in Unreal Engine (UE). The goal was to enable realistic modelling in real-time using digital twins for rapid scenario testing and better decision-making for preparedness, emergency response and resilient infrastructure planning.

The proposed techniques benefit not only the field of disaster resilience and preparedness but can also be applied to virtual production and gaming contexts. The aim was to bridge the gap between scientific earthquake simulation platforms and the visually rich, real-time environments used in virtual production.

The team successfully built a UE-based earthquake simulation capable of running in real time while producing convincing destruction effects. In demonstration scenes, buildings and walls responded dynamically to simulated seismic waves with realistic fracturing and motion, all while maintaining interactive frame rates. This met the project’s goal of balancing scientific accuracy with high visual fidelity without compromising performance.

A key technical achievement was the integration of real engineering data into UE’s Chaos physics system. The team expanded UE’s fracture material library with results from ANSYS-based analyses, allowing materials in the simulation to break and collapse in ways that closely reflect real-world behaviour under earthquake conditions.
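The idea of feeding offline engineering results into a real-time physics system can be sketched as a simple material-to-parameters lookup. The material names, fields and values below are illustrative assumptions, not the project's actual dataset or Unreal Engine's API:

```python
from dataclasses import dataclass

@dataclass
class FractureProfile:
    """Fracture parameters for one material, as might be exported from an
    offline engineering analysis. All values are illustrative placeholders."""
    tensile_strength_mpa: float   # stress at which the material cracks
    fragment_size_m: float        # characteristic debris size
    damping: float                # energy lost on fracture, 0..1

# Hypothetical library mapping material names to pre-computed profiles,
# standing in for an expanded fracture material library.
FRACTURE_LIBRARY = {
    "concrete_c30": FractureProfile(2.9, 0.25, 0.35),
    "brick_masonry": FractureProfile(0.8, 0.12, 0.45),
    "reinforced_concrete": FractureProfile(3.5, 0.40, 0.30),
}

def profile_for(material: str) -> FractureProfile:
    """Look up a fracture profile, falling back to a conservative default."""
    return FRACTURE_LIBRARY.get(material, FractureProfile(1.0, 0.2, 0.4))

print(profile_for("brick_masonry").tensile_strength_mpa)  # 0.8
```

In a real pipeline the profiles would be baked into the engine's destruction assets ahead of time, so the runtime only performs cheap lookups like this rather than re-running the analysis.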

Intuitiveness was prioritised through a streamlined workflow. Users simply assign pre-fractured materials to assets and enter earthquake parameters such as magnitude or seismic datasets. The complex physics processes operate in the background, allowing the tool to deliver realistic results without requiring specialist technical expertise.
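The "enter a magnitude, get shaking" part of such a workflow can be illustrated with a toy ground-motion generator. This is a minimal sketch under stated assumptions: the magnitude-to-amplitude scaling and frequency content are invented for illustration, not a published attenuation model, and a real tool would use recorded accelerograms or engineering ground-motion models:

```python
import math
import random

def synth_ground_motion(magnitude: float, duration_s: float = 10.0,
                        sample_hz: float = 60.0, seed: int = 0) -> list:
    """Generate a toy horizontal ground-acceleration trace (m/s^2),
    the kind of per-frame input a physics engine could be driven with."""
    rng = random.Random(seed)
    # Crude peak-acceleration scaling with magnitude (an assumption,
    # not a real ground-motion prediction equation).
    peak = 0.05 * 10 ** (0.5 * (magnitude - 5.0))
    freqs = (1.0, 2.3, 4.1)  # a few components for irregular motion
    phases = {f: rng.uniform(0.0, 2.0 * math.pi) for f in freqs}
    n = int(duration_s * sample_hz)
    trace = []
    for i in range(n):
        t = i / sample_hz
        # Envelope: quick ramp-up, slow decay, so shaking builds then fades.
        env = (t / 1.5) if t < 1.5 else math.exp(-(t - 1.5) / 4.0)
        s = sum(math.sin(2.0 * math.pi * f * t + phases[f]) for f in freqs)
        trace.append(peak * env * s / len(freqs))
    return trace

motion = synth_ground_motion(magnitude=7.0)
print(len(motion))  # 600 samples for 10 s at 60 Hz
```

Each sample would then displace the scene's ground plane (or apply an inertial force) for that frame, leaving the fracture physics to react in the background exactly as the paragraph above describes.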

The simulation was tested on a range of platforms, including a three-sided projection stage using nDisplay and high-end VR headsets. This confirmed the tool’s flexibility and suitability for virtual production environments. 

The outcome is a validated earthquake simulation platform that integrates a Vision Transformer (ViT) deep learning model to compare simulation results against reality, using both lab-based and real destruction scenes. The result is a scientifically grounded yet creatively deployable tool for disaster simulation and beyond. 

The collaboration is one of seven projects supported by the Embedded R&D (round two) funding call, with grants of up to £60,000 awarded to researchers at UK universities to explore the transfer of knowledge between academia and industry in areas aligned with virtual production. The projects took place over a six-month period, commencing in September 2024.

Dissemination and commercialisation

  • Two workshops were held in Australia and the United Kingdom, engaging over 100 delegates ranging from government officials and policymakers to architects, engineers and the creative community. The workshops gathered requirements and assessed usability for autonomous digital twins.
  • A journal paper was published in Computers & Graphics, and a GitHub repository has been created for future academic and commercial R&D.
  • xQuake™ is now available at the University of New South Wales in Sydney and the Royal College of Art in London, providing opportunities for future collaboration on the visualisation of extreme events.
  • All videos are available on the Computer Science Research Centre YouTube channel.

Image credit to Yitong Sun.

Categories: Research, Technology