Immersiveness of Audio Implementations across Video Game Engines
Presented by Mark Schlax
Outline
● Refresher of Acoustics
● What is a Video Game Engine?
● Why Real Time Audio is Difficult
● Audio Emulation Across Engines
● Audio Showcase
● Comparing Differing Techniques
● Implementation effect on Immersiveness
● References
Refresher of Acoustics
● Sound energy travels as a mechanical wave
  ○ Thus it requires a medium, typically air
  ○ It interacts with its surroundings: echoing, the Doppler effect, and more
● The human hearing range is typically between 20 Hz and 20,000 Hz
  ○ Below 20 Hz, sound is generally felt instead of heard
  ○ The hearing range is greatly affected by environmental factors
● Perception of sound intensity is logarithmic, not linear
  ○ Decibels (dB) are used as the scale
  ○ Doubling the intensity adds about 3 dB; a millionfold increase adds 60 dB (checked below)
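A quick check of those two figures using the standard intensity-level formula dB = 10 · log10(I / I0), as a minimal C++ sketch:

    // Verify the decibel figures quoted above.
    #include <cmath>
    #include <cstdio>

    int main() {
        // dB = 10 * log10(I / I0), the standard intensity-level definition
        std::printf("doubled intensity: %.2f dB\n", 10.0 * std::log10(2.0)); // ~3.01
        std::printf("millionfold:       %.2f dB\n", 10.0 * std::log10(1e6)); // 60.00
        return 0;
    }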
What is a Video Game Engine?
● A software development environment suite
  ○ Graphics renderer
  ○ Physics
  ○ Sound
  ○ Scripting
  ○ Animation
  ○ Artificial intelligence
  ○ Networking
  ○ Much more…
Why Real Time Audio is Difficult
● The physics engine would need to propagate mechanical waves across a medium
  ○ Done properly, this would require simulation at an atomic level
● Such engines exist, but they rely on static scenes or are backed by a supercomputer
● Video games typically contain dynamic environments and interactions
● Real time constraints further demand fast computation (see the estimate below)
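To put numbers on this, here is a back-of-envelope C++ sketch of a grid-based (FDTD) wave simulation, which is already far coarser than atomic-level physics. The room size, points per wavelength, and stability factor are illustrative assumptions:

    // Estimate the cost of brute-force acoustic wave simulation on a 3D grid.
    #include <cmath>
    #include <cstdio>

    int main() {
        const double c    = 343.0;             // speed of sound in air, m/s
        const double fMax = 20000.0;           // top of the hearing range, Hz
        const double ppw  = 10.0;              // grid points per wavelength (assumed)
        const double room = 10.0 * 10.0 * 3.0; // room volume in m^3 (assumed)

        const double dx    = c / fMax / ppw;            // cell size, ~1.7 mm
        const double cells = room / (dx * dx * dx);     // ~6e10 cells
        const double dt    = dx / (c * std::sqrt(3.0)); // CFL-stable time step
        const double steps = 1.0 / dt;                  // steps per simulated second

        // Roughly 2e16 cell updates per second of audio: supercomputer territory,
        // and this grid does not even approach per-particle resolution.
        std::printf("cells: %.2e, updates per second of audio: %.2e\n",
                    cells, cells * steps);
        return 0;
    }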
Audio Emulation Across Engines
● Popular “public” engines, free up to a point
  ○ Unreal Engine 4
  ○ Unity
  ○ NVIDIA VRWorks
    ■ Not an engine; more on this later
● Proprietary
  ○ Electronic Arts Frostbite
Audio Emulation Across Engines: UE4
● Unreal Engine 4 supports several sound asset types:
  ○ Sound Cues
    ■ Sound nodes offer behavioral modifiers to the final output
  ○ Reverb Effects
    ■ Echo density, air absorption, reverb gain, and more…
  ○ Sound Attenuation
    ■ The ability for a sound to lower in volume based on distance from the listener
      ● Min and max radius, with a distance algorithm in between
        ○ The min radius is where attenuation starts, the max radius where it ends
        ○ Linear, logarithmic, log-reverse, inverse, natural (sketched below)
    ■ Allows attenuation to be adjusted in a reusable manner
      ● Define it for one sound asset and use it for another
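The exact curve definitions live in the UE4 distance model attenuation docs (see References); this C++ sketch uses illustrative stand-in formulas for four of the shapes, just to show how a distance inside the min/max radius maps to a volume:

    // Illustrative distance-attenuation curves over a min/max radius.
    // The formulas are stand-ins, not UE4's exact math; the natural
    // (dB-interpolated) curve is omitted for brevity.
    #include <algorithm>
    #include <cmath>
    #include <cstdio>

    // Map distance into [0, 1] between the min and max radius.
    double normalize(double d, double dMin, double dMax) {
        return std::clamp((d - dMin) / (dMax - dMin), 0.0, 1.0);
    }

    double linear(double x)      { return 1.0 - x; }
    double logarithmic(double x) { return x <= 0.0 ? 1.0 : std::min(1.0, -0.5 * std::log(x)); }
    double inverse(double x)     { return x <= 0.0 ? 1.0 : std::min(1.0, 0.02 / x); }
    double logReverse(double x)  { return x >= 1.0 ? 0.0 : std::max(0.0, 1.0 + 0.5 * std::log(1.0 - x)); }

    int main() {
        const double dMin = 100.0, dMax = 1000.0; // attenuation start and end (assumed units)
        for (double d = dMin; d <= dMax; d += 300.0) {
            const double x = normalize(d, dMin, dMax);
            std::printf("d=%6.0f linear=%.2f log=%.2f inverse=%.2f logrev=%.2f\n",
                        d, linear(x), logarithmic(x), inverse(x), logReverse(x));
        }
        return 0;
    }

Because each curve is just a function of normalized distance, one definition can be evaluated for any sound asset, which is what makes the attenuation settings reusable.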
Audio Emulation Across Engines: Unity
● Unity emulates audio via Audio Sources and Audio Listeners
  ○ Sound is emitted by one object and received by another
  ○ Audio filters can be applied to a given object
    ■ Attenuation is done by a rolloff volume-distance curve
      ● Min/max radius, with three algorithms: linear, logarithmic, custom
    ■ Ducking (sketched below)
      ● Groups sounds together; the group volume is modified as a whole
    ■ Snapshots
      ● The settings of grouped sounds can be saved
      ● A list of snapshots can be transitioned between
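A minimal C++ sketch of the ducking idea: while one group (say, dialogue) is loud, another group (say, music) is attenuated as a whole. The names, threshold, and gain are invented for illustration and are not Unity's mixer API:

    #include <cstdio>

    struct Group {
        const char* name;
        double volume;            // group-wide gain applied to every member sound
    };

    // Duck 'ducked' while the triggering group's level exceeds a threshold.
    void duck(Group& ducked, double triggerLevel, double threshold, double duckedGain) {
        ducked.volume = (triggerLevel > threshold) ? duckedGain : 1.0;
    }

    int main() {
        Group music{"music", 1.0};
        const double dialogueLevel = 0.8;     // current loudness of the dialogue group
        duck(music, dialogueLevel, 0.2, 0.3); // drop music to 30% while dialogue plays
        std::printf("%s volume: %.2f\n", music.name, music.volume);
        return 0;
    }

In these terms, a snapshot is simply a saved set of group volumes that the mixer can transition toward.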
Audio Emulation Across Engines: Frostbite
● The Frostbite engine is closed source
  ○ Still, its stated goals are known: high variability with low predictability
  ○ Ducking and mixing work well only with a predictable soundscape
  ○ High Dynamic Range (HDR) audio as early as 2009, with Frostbite 1
    ■ Sounds need to feel loud, while quiet sounds remain audible
    ■ Each sound's perceived loudness is measured, and each sound is given a priority
      ● The HDR window is defined by its minimum, size, and release time
      ● The loudest sounds push the window away from quiet sounds (sketched below)
● Distance is inferred from loudness
  ○ No min/max radius: a sound is either heard or culled!
  ○ Less control over any particular sound
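Frostbite's implementation is closed, so this C++ sketch only illustrates the windowing idea as publicly described; all names and numbers are assumptions:

    // Sketch of an HDR audio window: it tracks the loudest current sound,
    // has a fixed size in dB, and falls back at a release rate when loud
    // sounds stop. Sounds below the window are culled.
    #include <algorithm>
    #include <cstdio>

    struct HdrWindow {
        double topDb           = 40.0; // current top of the window, dB
        double minTopDb        = 40.0; // the window never falls below this minimum
        double sizeDb          = 30.0; // window size: audible range below the top
        double releaseDbPerSec = 10.0; // how fast the top falls when unchallenged

        // Called once per frame with the loudest active sound.
        void update(double loudestDb, double dt) {
            if (loudestDb > topDb)
                topDb = loudestDb;                         // loud sound pushes the window up
            else
                topDb = std::max({minTopDb, loudestDb,
                                  topDb - releaseDbPerSec * dt}); // release downward
        }

        bool audible(double soundDb) const { return soundDb >= topDb - sizeDb; }
    };

    int main() {
        HdrWindow w;
        w.update(90.0, 0.016);                 // an explosion at 90 dB
        std::printf("quiet footstep (45 dB) audible? %s\n",
                    w.audible(45.0) ? "yes" : "no (culled)");
        return 0;
    }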
Audio Emulation Across Engines: VRWorks
● NVIDIA VRWorks is a library for Unreal Engine 4; one of its packages is Audio
  ○ It focuses on emulating sound in 3D space using NVIDIA OptiX ray tracing
    ■ The path of sound is traced in real time from source to listener
    ■ When a ray interacts with an object, a filter is applied to the sound (sketched below)
[Image: How Sound Moves in Space: Modeling Propagation]
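A toy C++ sketch of the propagation idea: each surface a ray hits attenuates its energy on the way to the listener, and the surviving energy drives the filter applied to the source sound. The path and absorption coefficients are invented; VRWorks Audio traces many OptiX rays against the actual scene geometry:

    #include <cstdio>
    #include <vector>

    struct Bounce {
        const char* surface;
        double absorption;   // fraction of the wave's energy the surface soaks up
    };

    int main() {
        // A hypothetical path: source -> wall -> ceiling -> listener.
        std::vector<Bounce> path = { {"concrete wall", 0.05}, {"ceiling tile", 0.60} };

        double energy = 1.0;
        for (const Bounce& b : path) {
            energy *= (1.0 - b.absorption);  // attenuate at each interaction
            std::printf("after %-14s energy = %.3f\n", b.surface, energy);
        }
        std::printf("energy at listener: %.3f\n", energy);
        return 0;
    }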
Audio Showcase: Unreal Engine 4
Audio Showcase: Unity
Audio Showcase: Frostbite 1 & 3
Audio Showcase: Nvidia VRWorks - Audio
Comparing Differing Techniques
● Public game engines
  ○ Audio functionality is largely similar
    ■ Unity allows a custom attenuation curve; UE supports a larger range of default curves
    ■ Both allow reverb effects, ducking, and equalization of volume and pitch
  ○ Large control over sound properties
    ■ Each sound must be calibrated, or loaded from saved parameters
● Frostbite & VRWorks
  ○ Introduce a paradigm shift
    ■ Less control for the developer
    ■ At any time, a sound may or may not be heard, based on the environment
  ○ Exclusive
    ■ Frostbite is EA's proprietary engine
    ■ VRWorks is only available as a package for UE, and does not work with AMD Radeon GPUs
Implementation effect on Immersiveness
● Immersiveness is highly subjective for video games
  ○ The visual senses usually take priority in immersion
● The goal has been to more accurately emulate sound
  ○ Immersion naturally comes along with it
● Progress has been made in both public and proprietary engines
  ○ When one engine incorporates a feature, another is quick to adopt it
    ■ The exception is paradigm shifts, such as HDR audio in Frostbite
  ○ Computers are becoming more powerful
    ■ Real time ray tracing is computationally taxing
      ● Though not as much as simulating per-particle physics
References
● Images
  ○ http://luboslenco.com/notes/2015_04_25/editor.jpg
  ○ https://developer.nvidia.com/sites/default/files/akamai/VRWorks/Training/VRWAudioPropag001b.png
● Unreal Engine 4
  ○ https://docs.unrealengine.com/en-us/Engine/Audio/DistanceModelAttenuation
  ○ https://docs.unrealengine.com/en-us/Engine/Audio/Overview
● Unity Engine
  ○ https://docs.unity3d.com/Manual/AudioOverview.html
  ○ https://docs.unity3d.com/Manual/AudioSpatializerSDK.html
● Frostbite Engine
  ○ https://www.ea.com/frostbite/news/how-hdr-audio-makes-battlefield-bad-company-go-boom