CAS Quarterly

Summer 2016

Issue link: https://digital.copcomm.com/i/713016


ADAPTIVE AUDIO in Gameplay: How Do They Do It?

by Matt Foglia CAS

In our typical linear world, we know what the scene at hand contains and we know when various sound elements will be present—and when they won't. As new genres of entertainment develop, shifting toward more user-interactive content, that preexisting script can begin to fade. Interactive media, along with augmented and virtual reality, often require a different type of thought process for those of us accustomed to working in a linear fashion. With these platforms, processors (or game engines) are constantly adjusting and updating the sounds that will play—along with their position and levels—as the "script" changes with each choice made by the user. Add to this surround and immersive options, and there can be numerous possible sound mixes for the same "scene." While AR and VR are evolving, adaptive audio for video games is more established and may provide a solid jumping-off point for audio implementation in these newer platforms.
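The update loop described above, in which the engine re-evaluates each sound's level and position as the game state changes rather than following a fixed script, can be sketched roughly as follows. This is a minimal, engine-agnostic illustration: the struct names, the inverse-distance rolloff, and the per-frame loop are assumptions made for clarity, not any particular game engine's or audio middleware's API.

    #include <algorithm>
    #include <cmath>
    #include <iostream>
    #include <string>
    #include <vector>

    // Illustrative types only; a real engine supplies its own scene and audio objects.
    struct Vec3 { float x, y, z; };

    struct SoundEmitter {
        std::string name;
        Vec3 position;    // where the sound lives in the scene
        float baseGain;   // the level the sound designer authored for the cue
    };

    // Simple inverse-distance attenuation: the farther the listener is from the
    // source, the quieter the cue plays back.
    float attenuate(const Vec3& listener, const Vec3& source, float baseGain) {
        float dx = source.x - listener.x;
        float dy = source.y - listener.y;
        float dz = source.z - listener.z;
        float dist = std::sqrt(dx * dx + dy * dy + dz * dz);
        return baseGain / std::max(1.0f, dist);
    }

    int main() {
        // Two emitters in the "scene"; in a real game these would be spawned,
        // moved, or silenced by gameplay code in response to the player's choices.
        std::vector<SoundEmitter> emitters = {
            {"ambience_rain", {0.f, 10.f, 0.f}, 0.8f},
            {"footsteps_npc", {3.f, 0.f, 4.f}, 1.0f},
        };

        Vec3 listener{0.f, 0.f, 0.f};

        // Per-frame update: recompute each emitter's effective level from the
        // current state. The mix is never fixed in advance; it follows the player.
        for (int frame = 0; frame < 3; ++frame) {
            listener.x += 1.0f;  // pretend the player moved this frame
            for (const auto& e : emitters) {
                float gain = attenuate(listener, e.position, e.baseGain);
                std::cout << "frame " << frame << ": " << e.name
                          << " gain=" << gain << "\n";
            }
        }
        return 0;
    }

The point of the sketch is the shape of the work, not the math: instead of printing one finished mix, the engine keeps re-deciding levels and positions every frame, which is why the same "scene" can yield many different mixes depending on what the user does.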
