CAS Quarterly

Summer 2019

Issue link: http://digital.copcomm.com/i/1152266


…space (such as your dub stage) and process that IR with your .SOFA file, then you could replicate how you would hear things in that specific space. While this could be great, it is not the sole purpose of Aural ID, as you will learn if you read the following interview with Genelec's Director of R&D, Dr. Aki Mäkivirta.

After exploring my Aural ID, I had a couple of follow-up questions. Once we coordinated our schedules (Genelec's HQ in Finland is seven hours ahead of my Central Standard Time in Nashville), I spent some Skype time with Dr. Aki Mäkivirta, Genelec's director of R&D.

WHO DO YOU SEE AS YOUR MAIN AUDIENCE FOR AURAL ID?

One application area that is a particularly good fit for this technology is game engineering. Games contain synthesized room effects and immersive audio content. Using the personal HRTF contained in Aural ID for these applications is a straightforward way of increasing the precision and productivity of sound engineering work. The second application is the engineering of binaural audio content, where the sound is presented over headphones; here again, the personal HRTF in Aural ID offers a more reliable presentation and can increase productivity. Today, audio is increasingly recorded using higher-order Ambisonic microphone technologies, and Aural ID is perfectly suited to monitoring such recordings binaurally, with improved reliability.

HOW ABOUT USING AURAL ID FOR TRADITIONAL MIXING?

Your thinking is in the direction of directly replacing loudspeaker-based conventional monitoring with headphone-based monitoring. A lot of work is going on around the world to develop different aspects of such technology. A significant part of this work has been to reduce or remove the unit-to-unit variations in [different] headphones' sound characters, as this is usually the first problem that pops up. There is also software that essentially claims to create a number of virtual loudspeaker sources around the listener. Some software also offers a head tracker to update the HRTF in real time, creating the illusion that the sound reproduction remains stationary in the room when the head turns. Once these systems allow the use of personal HRTF data, their quality and reliability can increase.

HOW CAN USERS CREATE AN ACCURATE SPACE TO PUT THEIR AURAL ID "IN"?

At the moment, several companies offer technology to record 3D impulse responses in rooms. Such technology can capture the room's influence in a way that can be used to render the room over headphones. Each early reflection has a certain level, a timing relative to the direct sound, and an orientation relative to the listener, and the information in Aural ID enables all of this to be presented correctly for the headphone listener. Naturally, the set of early reflections is unique to each audio source (loudspeaker) in the room. The number of early reflections to handle is significant, but technology suitable for handling this has been developed, for example, in the area of Ambisonic reverberation software.

HOW WOULD YOU SEQUENCE PLUGINS TO ACCURATELY REPRODUCE LISTENING IN A ROOM IF YOU HAD AN IR AND AN AURAL ID?

If we think of the correct order of the signal processing steps, first we need simulations of the real sources (loudspeakers) we want to listen to, then we need the room-related early reflections generated for all of these sources, and finally we render all of this audio information for headphones. The rendering uses the Aural ID-contained information about the HRTF for all directions of arrival relative to the listener. Aural ID is a complete representation of the HRTF for each ear individually, calculated in 836 different orientations per ear, covering the whole sphere around the listener fairly densely.
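The chain Mäkivirta describes, virtual loudspeakers first, then their early reflections, then HRTF rendering for headphones, can be sketched in a few lines of Python. This is only an illustrative sketch, not Genelec's implementation: the function and parameter names are hypothetical, and each arrival direction is assumed to already be mapped to a measured HRIR pair.

```python
import numpy as np
from scipy.signal import fftconvolve

def render_binaural(source, speakers, reflections, hrirs, fs=48000):
    """Render a mono source to two-channel headphone audio (illustrative sketch).

    speakers:    list of direction indices, one per virtual loudspeaker
    reflections: list of (direction_index, delay_seconds, gain) early reflections
    hrirs:       dict mapping direction index -> (left_hrir, right_hrir) arrays
    """
    out_len = len(source) + max(len(h[0]) for h in hrirs.values()) - 1
    # leave room for the latest-arriving reflection
    max_delay = max((int(round(d * fs)) for _, d, _ in reflections), default=0)
    out = np.zeros((2, out_len + max_delay))

    def add_path(signal, direction, offset=0):
        # convolve one arrival path with the HRIR pair for its direction
        hl, hr = hrirs[direction]
        left = fftconvolve(signal, hl)
        right = fftconvolve(signal, hr)
        out[0, offset:offset + len(left)] += left
        out[1, offset:offset + len(right)] += right

    # 1) direct sound from each virtual loudspeaker
    for direction in speakers:
        add_path(source, direction)
    # 2) early reflections: delayed, attenuated copies from their own directions
    for direction, delay, gain in reflections:
        add_path(gain * source, direction, offset=int(round(delay * fs)))
    return out
```

With a real Aural ID, the `hrirs` table would hold the 836 measured orientations per ear, and each reflection pulled from the room's 3D impulse response would be routed through the HRIR pair nearest its direction of arrival.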
Using SPARTA Binauraliser to read the .SOFA file.
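Tools like SPARTA's Binauraliser read the HRIR set straight from the .SOFA file. A SOFA file is a netCDF-4 (HDF5) container, so a rough sketch of pulling out the HRIRs and their measured directions, and picking the grid point nearest a requested direction, might look like the following. The variable names (`Data.IR`, `SourcePosition`, `Data.SamplingRate`) follow the standard SimpleFreeFieldHRIR convention; the helper names are hypothetical.

```python
import numpy as np

def load_hrirs(path):
    """Read HRIRs and source directions from a SimpleFreeFieldHRIR SOFA file."""
    import h5py  # SOFA is netCDF-4, which h5py can open as plain HDF5
    with h5py.File(path, "r") as f:
        hrirs = np.array(f["Data.IR"])              # (measurements, 2 ears, samples)
        directions = np.array(f["SourcePosition"])  # (measurements, 3): az, el, distance
        fs = float(np.array(f["Data.SamplingRate"]).ravel()[0])
    return hrirs, directions, fs

def nearest_direction(directions, az, el):
    """Index of the measured direction closest to the requested az/el (degrees)."""
    az_r, el_r = np.radians(directions[:, 0]), np.radians(directions[:, 1])
    tgt_az, tgt_el = np.radians(az), np.radians(el)
    # unit vectors on the sphere, then nearest by largest dot product
    vecs = np.stack([np.cos(el_r) * np.cos(az_r),
                     np.cos(el_r) * np.sin(az_r),
                     np.sin(el_r)], axis=1)
    tgt = np.array([np.cos(tgt_el) * np.cos(tgt_az),
                    np.cos(tgt_el) * np.sin(tgt_az),
                    np.sin(tgt_el)])
    return int(np.argmax(vecs @ tgt))
```

For an Aural ID file, `hrirs` would contain one impulse-response pair for each of the 836 orientations, and the nearest-direction lookup is the simplest way to fetch the pair for any arrival direction (production renderers typically interpolate between neighboring points instead).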
