Microsoft has just released Project Acoustics, a wave-based 3D audio system for the Unity and Unreal game engines. In an approach analogous to baked static lighting, Project Acoustics simulates audio by defining acoustic properties in your scene and then simulating millions of waveforms within the environment offline. It only works on Windows versions of Unity and Unreal Engine, but is capable of targeting Windows, Mac, Android and Xbox.
Details of Project Acoustics:
Project Acoustics is a wave acoustics engine for 3D interactive experiences. It models wave effects like occlusion, obstruction, portaling and reverberation in complex scenes without requiring manual zone markup or CPU-intensive raytracing. It also includes game engine and audio middleware integration. Project Acoustics’ philosophy is similar to static lighting: bake detailed physics offline to provide a physical baseline, and use a lightweight runtime with expressive design controls to meet your artistic goals for the acoustics of your virtual world.
Ray-based acoustics methods can check for occlusion using a single source-to-listener ray cast, or drive reverb by estimating local scene volume with a few rays. But these techniques can be unreliable because a pebble occludes as much as a boulder. Rays don’t account for the way sound bends around objects, a phenomenon known as diffraction. Project Acoustics captures these effects with a wave-based simulation, so the acoustics are more predictable, accurate and seamless.
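To make the "pebble occludes as much as a boulder" problem concrete, here is a toy Python sketch (the scene representation is purely illustrative, not anything from Project Acoustics): a single source-to-listener ray cast yields the same binary answer no matter how small the blocker is, whereas a wave simulation would report very different attenuation for each.

```python
def single_ray_occluded(blocker_radii_on_ray: list[float]) -> bool:
    """Toy model of single-ray occlusion: any intersection at all
    counts as 'occluded', regardless of the blocker's size."""
    return len(blocker_radii_on_ray) > 0

# A pebble (0.02 m) and a boulder (2.0 m) on the ray give the same answer:
print(single_ray_occluded([0.02]))  # True
print(single_ray_occluded([2.0]))   # True
```

A diffraction-aware simulation would instead output a graded attenuation that grows with blocker size, which is exactly the information a binary ray test throws away.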
Project Acoustics’ key innovation is to couple true wave-based acoustic simulation with traditional sound design concepts. It translates simulation results into traditional audio DSP parameters for occlusion, portaling and reverb, and the designer retains control over this translation process. For more details on the core technologies behind Project Acoustics, visit the research project page.
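To give a rough feel for what "translating a baked result into a DSP parameter under designer control" might look like, here is a minimal, hypothetical Python sketch. The names `baked_occlusion_db` and `occlusion_multiplier` are assumptions for illustration only; Project Acoustics' actual runtime exposes its own parameters and controls.

```python
def occlusion_gain(baked_occlusion_db: float, occlusion_multiplier: float = 1.0) -> float:
    """Convert a baked occlusion value (dB of attenuation from the offline
    simulation) into a linear gain for a runtime DSP chain, scaled by a
    designer-tunable multiplier. Hypothetical sketch, not the real API.
    """
    # The designer control scales the simulated attenuation up or down.
    effective_db = baked_occlusion_db * occlusion_multiplier
    # dB of attenuation -> linear amplitude gain.
    return 10.0 ** (-effective_db / 20.0)

# A fully open path: no attenuation.
print(occlusion_gain(0.0))        # 1.0
# A heavily occluded path, with the designer exaggerating occlusion 1.5x.
print(occlusion_gain(12.0, 1.5))  # ~0.126
```

The point of the design is visible even in this toy: the physics (the baked dB value) stays fixed, while the artistic intent (the multiplier) is a cheap runtime knob.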
You can learn more about Project Acoustics and see both the Unity and UE5 examples in action in the video below.