Post Magazine

August 2012

worked on at 5.6K x 4K. The anamorphic 35mm footage was scanned at Warner Bros. Motion Picture Imaging in Burbank on a FilmLight Northlight at 4K, and all the scope shots were worked on at 4K. The VistaVision scanning was also done at MPI, at 6K; all the VistaVision material was worked on at 6K and reformatted to IMAX.

Franklin notes that a 2K scan is 10MB of data, while an 8K scan can be up to 220MB. Suddenly you've got a couple of gigabytes of data just for a second of film. Obviously that increases your render times and your storage. "I think in the end we ended up with half a petabyte (500TB) of online storage that we could actually use to manipulate the data. In terms of the total amount of data, we must have created easily a thousand terabytes of data on this film."
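As a back-of-the-envelope illustration of those figures (my extrapolation, not the article's): if per-frame file size scales with pixel count from the quoted 2K number, the per-second data rates at 24fps look roughly like this.

```python
# Rough film-scan data rates, extrapolating from the figures Franklin
# quotes (2K scan = 10MB, 8K scan = up to 220MB).
# Assumption (mine, not the article's): per-frame size scales with pixel
# count, i.e. with the square of the horizontal resolution, at 24fps.

MB_PER_2K_FRAME = 10.0
FPS = 24

for label, k in [("2K", 2), ("4K", 4), ("6K", 6), ("8K", 8)]:
    frame_mb = MB_PER_2K_FRAME * (k / 2) ** 2
    per_second_gb = frame_mb * FPS / 1024
    print(f"{label}: ~{frame_mb:>5.0f} MB/frame, ~{per_second_gb:.1f} GB per second of film")
```

Simple scaling gives about 160MB for an 8K frame; Franklin's "up to 220MB" suggests deeper bit depths or larger frame areas in practice. Either way, a 6K or 8K scan lands squarely in the gigabytes-per-second range he describes.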
Another challenge is that there are no digital systems, monitors or projectors that allow you to see a full-resolution IMAX frame. "You can only see little bits of the picture," explains Franklin. "So you have to develop a feel for it. You have to say, 'Okay, I'm looking at this low-resolution proxy'" — Dneg considers a 2K image a low-resolution proxy of the final frame. "While looking at this low-res proxy, I'm going to have a good feel for what it is going to look like when it goes out to IMAX and we see it projected."

For dailies and VFX comps, they down-res'd the images, printed them out onto standard 35mm and then projected them. "And every so often we'd make selects and see those at full-resolution IMAX in whatever IMAX theatre we could get access to, whether it was the AMC Lincoln Center in New York, or the AMC CityWalk in Los Angeles, or the BFI IMAX in London."

TECH INNOVATIONS

One of Dneg's technical innovations was a physically based raytracing system within Pixar's RenderMan. "Basically, what it does is allow us to accurately calculate the exchange of energy between surfaces. This produced an absolutely extraordinary level of consistency in the 3D renders this time."

In the past, Franklin had to carefully scrutinize a CG image, place it against the live-action photographic reference and tell the guys, "The shadow in the wheel arch of the Batmobile needs to be a point darker. The highlights on the windscreen need to be two points lighter. That's got not quite enough contrast." They addressed individual surfaces on every object and graded them by hand to get them to look right. This time around, he says, the system was so spot-on that the instructions were simply: make it a bit darker; make it a bit brighter. "Once they got this thing set up, it was so fast to work with that we could show Chris an animatic, a grayscale animation of The Bat flying down the street. He'd say, 'That's great. Let's get that rendered,' and then I'd be able to show a lit, rendered, composited version of the shot 24 hours later."
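The article doesn't detail how the RenderMan system was built, but the "exchange of energy between surfaces" Franklin describes is the core calculation of any physically based renderer. Below is a minimal, illustrative sketch — not Dneg's code — of the differential form factor that governs how much diffusely reflected energy one small patch sends to another; the function name and example geometry are hypothetical.

```python
import math

# Minimal sketch of diffuse energy exchange between two small surface
# patches -- the differential "form factor" at the heart of physically
# based, energy-conserving rendering. Illustrative only, not Dneg's
# RenderMan integration; all names here are hypothetical.

def form_factor(p_a, n_a, p_b, n_b, area_b):
    """Fraction of energy leaving patch A that arrives at patch B.

    p_a, p_b : (x, y, z) patch centers
    n_a, n_b : unit surface normals
    area_b   : area of the receiving patch
    """
    d = [b - a for a, b in zip(p_a, p_b)]
    r2 = sum(c * c for c in d)
    r = math.sqrt(r2)
    d = [c / r for c in d]                     # unit direction A -> B
    cos_a = sum(c * n for c, n in zip(d, n_a))
    cos_b = -sum(c * n for c, n in zip(d, n_b))
    if cos_a <= 0 or cos_b <= 0:
        return 0.0                             # patches face away from each other
    return (cos_a * cos_b * area_b) / (math.pi * r2)

# Example: a floor patch receiving energy from a wall patch of 0.1 units
# of area, offset one unit up and one unit across:
f = form_factor((0, 0, 0), (0, 0, 1), (0, 1, 1), (0, -1, 0), 0.1)
print(f"fraction of energy transferred: {f:.4f}")
```

Because the cosine terms and the 1/r² falloff conserve energy by construction, surfaces come out consistently exposed relative to one another — which is why Franklin's notes could shrink from per-surface hand grading to a simple "a bit darker, a bit brighter."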
Another innovation was a new on-set high dynamic range (HDR) capture system. Basically, they would take photographs on set across the whole range of F-stops in order to characterize the lighting environment and to capture the range of exposure they would eventually see on film. In the past this was done by taking a chrome sphere out onto the set and shooting it with a camera. The sphere would give a 360-degree image all around, but it would be low resolution and only as good as the sphere itself, which can pick up dings and fingerprints. They evolved their procedure into using fisheye lenses: a camera with a fisheye lens on a nodal mount, pivoted around the nodal point of the lens, shooting a series of exposures and panning the camera 90 degrees between exposures.

This time they built a little robot with four fisheye-lens cameras in a casing mounted on a tripod, driven by a proprietary computer system called DN Snapper. After each take they'd run out to the middle of the set, plant the robot down, and the computer would run all the exposures. The process takes about a minute; in the past it would take up to 30 minutes to capture the lighting environment. "That gave us very accurate lighting maps, which then fed into the new physical rendering system, and it made a tremendous difference to the way that all the CGI looked in the film," recalls Franklin.

Dneg uses Autodesk Maya as its main 3D package. They also use Side Effects Houdini. "That's a very important piece of software for doing all of our dynamic simulations. You know, blowing up buildings and crashing things and stuff like that." They have their own in-house fluid dynamics system called Squirt, used to create the big digital pyrotechnics, as well as a lot of their own proprietary tool sets. They also use Nuke as their main compositing package. "But again, it's our own flavor of Nuke that has a lot of our own color management systems and things going in there."

Franklin says there are approximately 20,000 CPU cores in their renderfarm. The main operating system is Linux, and most of the workstations are regular PCs running 3.46GHz dual 6-core CPUs with 48GB of memory and Nvidia graphics cards.
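Returning to the on-set HDR capture described above: the article doesn't describe DN Snapper's internals, but bracketed exposure series like the ones it shoots are conventionally merged into a single high dynamic range lighting map. Here is a minimal sketch of that merge under two assumptions of mine — a linear camera response and known exposure times (production pipelines also calibrate the response curve); the function and data are hypothetical.

```python
import numpy as np

# Minimal sketch of merging a bracketed exposure series into one HDR
# radiance map, in the spirit of the on-set capture described above.
# Assumptions (mine, not the article's): linear sensor response and
# known exposure times.

def merge_hdr(images, exposure_times):
    """images: list of float arrays in [0, 1]; exposure_times: seconds."""
    acc = np.zeros_like(images[0], dtype=np.float64)
    weight_sum = np.zeros_like(acc)
    for img, t in zip(images, exposure_times):
        # Hat weighting: trust mid-range pixels, ignore clipped ones.
        w = 1.0 - np.abs(2.0 * img - 1.0)
        acc += w * (img / t)          # estimated scene radiance
        weight_sum += w
    return acc / np.maximum(weight_sum, 1e-8)

# Example: three brackets two stops apart (1/4s, 1s, 4s) of the same view.
rng = np.random.default_rng(0)
radiance = rng.uniform(0.0, 8.0, size=(4, 4))       # "true" scene values
times = [0.25, 1.0, 4.0]
brackets = [np.clip(radiance * t, 0.0, 1.0) for t in times]
hdr = merge_hdr(brackets, times)
print(np.round(hdr, 2))
```

The weighted average lets each pixel be read from whichever bracket exposed it best, recovering a range of radiance far beyond any single frame — which is what makes such maps usable as lighting environments for the physically based renders.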