MPSE Wavelength

Spring 2020

Issue link: http://digital.copcomm.com/i/1261193


"Keeping track of the ADR lines through multiple revisions of the script was the biggest challenge," says supervising ADR editor Steve Slanec. "In the feature world, we are used to marked, very deliberate turnovers of changes. It required some real diligence to make sure we were getting everything we needed in the recording studio and putting it in the right spot."

As each iteration of visuals turns over, opportunities for additional dialog and effects reveal themselves, and a new temp mix has to be iterated for each new version. "The production on this [title] was quick, and happening in parallel with recording the actors (including re-voicing)," Slanec explains.

The rapid iteration of dialog recording and editing also required a fluid approach to processing, recalls Bolen: "We tried processing ZO-E3's dialog in Pro Tools using the familiar droid character treatments like C-3PO and K-2SO. However, something about the signature quality of Maya Rudolph's voice kept getting lost. She sounded like a Star Wars droid, but she no longer sounded like Maya. I reached out to Tim Nielsen, supervising sound editor for Solo: A Star Wars Story, to find out what he had done for L3-37, voiced by Phoebe Waller-Bridge. We ended up settling on a harmonizer approach using a plugin in the Audiokinetic Wwise audio engine. We could ingest all new WAV files from the latest recording sessions and apply the processing in the game engine, rather than spending extra time pre-rendering the processing in Pro Tools. If creative director Ben Snow felt the vocal processing was too much or too little in a given scene, or even on an individual line, I could change the settings on that asset in less than a minute and re-render the asset without going back into the DAW. I could easily vary the higher and lower harmonics applied on a line-by-line basis, choosing which dialog needed to be more melodic or more dissonant depending on narrative context."

Meanwhile, in the music editorial department, the name of the game was hurry up and wait. Due to the player-directed length of many scenes, the composer's materials needed to be edited to fit the variable scene lengths while still providing seamless, narrative-focused transitions to cinematic scenes that weren't scored until late in the production timeline, to accommodate time-locked visuals.

Just as in feature films, there is a concern regarding the sonic real estate that music may occupy. Much as immersive theatrical formats can use non-screen channels to provide greater clarity, there is both creative and sonic value in shaping the VR experience through score that encompasses or directs the user. The difference in VR is that you're constrained to headphones as opposed to a theater filled with speakers. Using HRTF-encoded (head-related transfer function) assets rather than true stereo assets allowed the Vader Immortal team to spatialize the game's score deep into the virtual world. But this technique comes at a cost, Bolen explains: "The tradeoff of HRTF encoding is a noticeable, and often undesirable, coloration of the sound that should be embraced but accounted for." However, these disadvantages are largely outweighed by the openness of an otherwise crowded soundstage. By pushing score outside the immediacy of true stereo, you make room sonically for sound assets that may be a higher priority in the narrative.
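For readers who want a feel for the harmonizer idea Bolen describes, here is a minimal offline sketch in Python. It is not Wwise's actual plug-in or Skywalker's settings; the file names, intervals, and gains are invented, and it assumes the librosa and soundfile libraries. The point is simply blending pitch-shifted copies above and below the dry line, with per-line settings that can be re-rendered in seconds.

```python
# A minimal sketch of a harmonizer-style droid voice treatment, assuming
# librosa and soundfile are installed. All paths and settings are
# hypothetical illustrations, not the production's actual values.
import librosa
import soundfile as sf

def render_droid_line(in_path, out_path, up_steps=4.0, down_steps=-5.0,
                      up_gain=0.35, down_gain=0.35, dry_gain=0.8):
    """Blend the dry voice with higher and lower pitch-shifted copies."""
    dry, sr = librosa.load(in_path, sr=None, mono=True)
    up = librosa.effects.pitch_shift(dry, sr=sr, n_steps=up_steps)
    down = librosa.effects.pitch_shift(dry, sr=sr, n_steps=down_steps)
    mix = dry_gain * dry + up_gain * up + down_gain * down
    # Normalize to avoid clipping after summing the three layers.
    peak = max(abs(mix).max(), 1e-9)
    sf.write(out_path, mix / peak * 0.9, sr)

# Per-line settings: more "melodic" (consonant intervals) or more
# "dissonant" (e.g., tritones) depending on narrative context.
render_droid_line("zoe3_line_042.wav", "zoe3_line_042_droid.wav",
                  up_steps=7.0, down_steps=-5.0)   # consonant fifths
render_droid_line("zoe3_line_043.wav", "zoe3_line_043_droid.wav",
                  up_steps=6.0, down_steps=-6.0)   # dissonant tritones
```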
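The variable-scene-length problem in music editorial can be pictured with a small sketch as well. Assuming a cue is delivered as intro / loopable middle / outro stems (a hypothetical layout, not necessarily the production's workflow), the middle section can be repeated with equal-power crossfades until the cue covers the player-driven scene duration:

```python
# Sketch of fitting a loopable score cue to a player-driven scene length.
# File names are placeholders; a real cue would be authored to loop cleanly.
import numpy as np
import soundfile as sf

def fit_cue_to_scene(intro, loop, outro, sr, target_sec, xfade_sec=0.5):
    """Repeat the loopable middle until the cue covers target_sec, then end."""
    xf = int(xfade_sec * sr)

    def splice(a, b):
        # Equal-power crossfade from the tail of a into the head of b.
        ramp = np.linspace(0, np.pi / 2, xf)
        out_env, in_env = np.cos(ramp), np.sin(ramp)
        if a.ndim == 2:                      # stereo stems: fade per channel
            out_env, in_env = out_env[:, None], in_env[:, None]
        tail = a[-xf:] * out_env + b[:xf] * in_env
        return np.concatenate([a[:-xf], tail, b[xf:]])

    out = intro
    while (len(out) + len(outro) - xf) / sr < target_sec:
        out = splice(out, loop)
    return splice(out, outro)

intro, sr = sf.read("cue_intro.wav")
loop, _ = sf.read("cue_loop.wav")
outro, _ = sf.read("cue_outro.wav")
sf.write("cue_93s.wav", fit_cue_to_scene(intro, loop, outro, sr, 93.0), sr)
```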
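And to illustrate the HRTF-versus-true-stereo tradeoff Bolen describes: binaural rendering amounts to convolving a mono stem with a left/right pair of head-related impulse responses, which is also where the coloration comes from. A rough sketch, assuming hypothetical HRIR files and the numpy/scipy/soundfile libraries:

```python
# Sketch of binaural (HRTF) rendering of a score stem, versus leaving it
# as a plain stereo bed. All three input files are hypothetical.
import numpy as np
import soundfile as sf
from scipy.signal import fftconvolve

stem, sr = sf.read("score_stem_mono.wav")   # mono score stem
hrir_l, _ = sf.read("hrir_left.wav")        # impulse responses measured
hrir_r, _ = sf.read("hrir_right.wav")       # for one direction on a dummy head

# Each ear hears the stem filtered through its own impulse response; the
# interaural differences place the source, and the filtering colors it.
binaural = np.stack([fftconvolve(stem, hrir_l),
                     fftconvolve(stem, hrir_r)], axis=1)
binaural *= 0.9 / max(np.abs(binaural).max(), 1e-9)   # avoid clipping

sf.write("score_binaural.wav", binaural, sr)
```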
But this cannot be the single solution for music treatment, as traditional music moments remain an important device, allowing the sound team to bring the score to the forefront and play into dramatic moments. Bill Rudolph explains, "When we want things to seem 'more cinematic,' what we are really saying is using more 3D-positioned and 2D sounds rather than HRTF. But when we talk about making things more 'immersive,' we talk about pushing assets through the real-time binaural spatialization tools in Wwise."

[Photo: Vader Immortal: Episode 1's ZO-E3]

Tucked in the midst of the storyline of Vader Immortal: Episode 1, you find yourself stepping into your first lightsaber dojo. This is the chamber where I found myself grinning from ear to ear. The sonic joy of destroying droids with abandon comes down to some wicked, traditionally edited sound design, as well as procedurally generated sound effects. The handoff between the two makes for a seamless experience of wielding a lightsaber that seems more than virtual. As you advance through the storyline of subsequent episodes, you obtain familiar hilts and different color blades.
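Rudolph's cinematic-versus-immersive distinction boils down to a per-asset routing decision. A toy sketch of that decision, with invented names that are not Wwise's actual API:

```python
# Sketch of per-asset routing: "cinematic" moments stay on a head-locked
# 2D bed or simple 3D panning, while "immersive" ones are sent through a
# real-time binaural spatializer. Names and fields are illustrative only.
from dataclasses import dataclass
from enum import Enum, auto

class Treatment(Enum):
    BED_2D = auto()         # head-locked stereo, no HRTF coloration
    POSITIONED_3D = auto()  # panned in 3D, still no binaural filtering
    BINAURAL = auto()       # routed through the HRTF spatializer

@dataclass
class AudioAsset:
    name: str
    treatment: Treatment

def route(asset: AudioAsset) -> str:
    if asset.treatment is Treatment.BINAURAL:
        return f"{asset.name} -> binaural bus (real-time HRTF)"
    if asset.treatment is Treatment.POSITIONED_3D:
        return f"{asset.name} -> 3D-positioned bus"
    return f"{asset.name} -> 2D stereo bed"

# A dramatic swell plays "cinematic"; ambient score sits in the world.
print(route(AudioAsset("score_finale_swell", Treatment.BED_2D)))
print(route(AudioAsset("score_ambient_layer", Treatment.BINAURAL)))
```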
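The procedural side of that lightsaber can be sketched too. Under the classic model of the sound, a steady electrical hum whose pitch and level ride the swing speed, a minimal generator might look like the following; the specific values are invented, not the game's actual synthesis:

```python
# Sketch of a procedurally generated lightsaber layer: an oscillator hum
# whose pitch and level are driven by swing speed. Values are illustrative.
import numpy as np
import soundfile as sf

SR = 48000

def saber_hum(swing_speed, seconds=2.0, base_hz=90.0):
    """swing_speed: coarse control values in [0, 1] spread over `seconds`."""
    n = int(seconds * SR)
    t = np.arange(n) / SR
    # Resample the coarse control curve to one value per audio sample.
    speed = np.interp(t, np.linspace(0, seconds, len(swing_speed)), swing_speed)
    # Faster swings raise the hum's pitch (a crude Doppler-like cue)...
    freq = base_hz * (1.0 + 0.6 * speed)
    phase = 2 * np.pi * np.cumsum(freq) / SR
    hum = 0.5 * np.sin(phase) + 0.25 * np.sin(2 * phase + 0.5)
    # ...and its level, so hard swings read as a "whoosh."
    return hum * (0.4 + 0.6 * speed)

# A swing that accelerates and settles, e.g., sampled controller velocity.
sf.write("saber_swing.wav", saber_hum(np.array([0.0, 0.2, 1.0, 0.6, 0.1])), SR)
```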
