Computer Graphics World

March/April 2014

Issue link: https://digital.copcomm.com/i/298092


is rehearsing his speech as he walks, and in the process is oblivious to what is happening around him. But the wide-angle view of the GoPro enables the audience to see what the main character has not.

"I wanted [the short] to be in Nick's point of view but didn't necessarily know if it would work, and had to shoot tests to prove that it would, that you could be in someone's POV and see things through the person's eyes that they are not seeing," explains Maschwitz. So, he tested the concept by walking up and down his street while reading a piece of paper, with the GoPro attached to his head. The test shots proved that the idea was sound and that believable visual effects could be inserted into the wide-angle footage of the camera, which is otherwise mainly used for extreme-action video capture. "We did an early version of the scene with the car getting beamed up to the spaceship, and it was funny and seemed to be working. That gave us the confidence to proceed," Maschwitz says.

In the Neighborhood

Producer Raub Shapiro found an ideal setting for the shoot: a quiet neighborhood in Alameda, California. As luck would have it, the location has been used as a set a number of times, and there was even a person from the neighborhood who facilitates shoots there. In addition, she provided access to her driveway, across the street, for the film crew to use as a staging area.

Unlike previous Red Giant films, which featured one of the company's tools, "Run Like Hell" instead would be made using a range of offerings from the developer and others – tools that Maschwitz would have selected had he tackled the project independently of Red Giant.
Using off-the-shelf software was critical to keeping costs down, and being able to set up at the base camp across the street was especially convenient: it enabled Maschwitz to use Red Giant's BulletProof, a footage prep and delivery tool, running on a MacBook Air, to review and back up the footage on-site. For example, at one point Maschwitz was on his last take for a shot, but after review in BulletProof, it was apparent that he did not get what he needed. "I was able to go back immediately and reshoot without losing my light," he adds.

With BulletProof, "I could make notes and add markers. I even had a color lookup table," says Maschwitz. "I knew my editor would see the notes I made, because when you export from BulletProof, the notes come through as metadata and load into [Adobe] Premiere. Unfortunately, this doesn't always happen on set – our shots don't always find their way to the editor, despite our best intentions. This time we knew they would."

Approximately a dozen people were involved in the project, with six at the shoot and the others involved on the post side later. Front and center at the shoot was Maschwitz, acting as director and actor while wearing the camera, as others maneuvered behind him to stay out of the camera's view. "VFX artist Zachary Nussbaum's main focus was painting out my producer's leg from a scene early in the film. The field was so wide-angle that when I looked down, I could see his leg," Maschwitz says.

The majority of the short was achieved with a single shot as planned, but the end required several shots, with a transition to an all-CG shot of the house exploding. Maschwitz filmed the house and neighboring homes with a Canon EOS 5D Mark III camera, and pieced the photographs together to create a virtual environment. He also shot some of the elements separately so they could be composited into the footage later. Editing was performed by Gregory Nussbaum and Pictures In A Row.
Post Processes

Atomic Fiction pitched in for the CGI, with Ryan Tudhope supervising the 3D spaceship render, done via the cloud as opposed to a local server farm, which likely wouldn't have been able to handle the load. "It was a huge endeavor," Maschwitz points out. "We figured out that to reverse the lens distortion of the GoPro, they would have to render at 3K, because the render gets scaled down to fit a squared-off CG type of POV in the GoPro footage. So we rendered a dozen passes at 3K – 600 frames each, of a beautifully lit and textured spaceship, in a matter of days thanks to their cloud setup." Rendering was done using Autodesk's Maya with the V-Ray renderer.

In that scene, the CG spaceship was a TurboSquid model, while the jet models came from Video Copilot. "The magic was in the rendering and Atomic Fiction's ability to turn it into something that looked very real," says Maschwitz. The car lifting off the ground was a 2D image taken from the original footage and animated with some After Effects 2.5D animation.

During post, the group used After Effects and Premiere CC extensively. In fact, the project pipeline evolved around After Effects, which was used for the camera tracking, animation, and camera moves. The process, however, actually started in Maxon's Cinema 4D, with Maschwitz using the new Cineware feature within After Effects to bridge to Cinema 4D, where he began to animate the spaceship before the Atomic Fiction team translated it into a Maya scene for final rendering.

■ AN ACTRESS PERFORMS a stunt that was later inserted into the visual effects shot shown on page 27.
