Computer Graphics World

DECEMBER 2010



CG Environments

So that crews at the outsource partners and at Digital Domain could publish assets directly into shots and move into shot production quickly, previs and layout supervisor Scott Meadows led a team that blocked out the entire film in stereo. "The film was conceived to be filmed with a 3D system from the first," Kosinski says. "We spent a year getting ready. We shot the film in 75 days, and then spent a year and a half in postproduction."

Also to help speed postproduction, sequence supervisors at Digital Domain gave the outsource partners assets and asked them to switch to The Foundry's Nuke, Autodesk's Maya, and Chaos Group's V-Ray. "Additionally, we gave them scripts in Nuke to handle vertical disparity [for misaligned stereo cameras]," Barba says. "We got them up to speed on a short schedule."

[Image caption: At left, actors fought disc games on bluescreen stages. Artists rotoscoped the images to create accurate reflections on the glass floors. At right, crowds of digital agents surrounding the ever-changing courts cheered the action.]

On Set

Kosinski's director of photography, Claudio Miranda, shot the film in stereo using six Sony CineAlta F35 cameras on three Pace Fusion 3D rigs; the second unit used a fourth rig and cameras. "Joe [Kosinski] wanted to build sets for the actors to interact with, to ground us and to lighten our load, so we shot much of the film on sets," Barba says. The disc games and Lightcycle sequences, however, are predominantly full CG, as are two climactic sequences at the end of the film, which include a spectacular Lightjet dogfight.

"…used to reconstruct the geometry. We brought those photos into Track and used them as an additional camera." The artists could also use data from the witness cameras and survey data—measurements of the set—which gave the tracking team the distance between objects in an image. "We have a strong survey team," MacKenzie says. "We can feed their measurements into the software, and Track will solve using those parameters." In fact, the artists could add new points at any time, and those points could affect the track. "Tracking becomes an organic pipeline of adding and building enough points," Roble says. "The solvers take in identified points on the image and any other information we have in the scene, and as you add new data, the original tracks get better. It's an iterative process."

Once the artists tracked the camera, they ran test footage and looked for any problems, checking to see if the track looked good in both stereo "eyes," that is, from the left- and right-eye camera views.

This was especially true for the tracked head. To replace Reardon's head with Clu's CG head, the team needed to know the exact position of the body double's head in every frame. "When these guys were tracking Reardon's head, they'd figure out where the camera was, and then figure out where the head was in the scene," Roble says. "But, tracking deformable objects like a head is particularly difficult. If it moves, or if the eyes move, or if the hair moves, the track is squishy. There's no solid place on the head that you can rely on. The points shift all over the place. There's a point at the top of the nose where it meets the eyes, but even that part moves when someone frowns. The back of the head attaches to the neck, and the neck changes shape all the time."

To have Clu's digital hair fall properly from his CG head onto the image of Reardon's collar and jacket, the trackers also needed to precisely determine where the body double's costume was in 3D space. "We had to be careful that there was no high-frequency jitter that would cause the hair simulation to freak out," MacKenzie says. Even though a track might look good visually, any vibration in the collar would affect the simulation. "The hair is like tiny springs," Roble says.
"If there is any vibration, the hairs would be like a needle on a record player bouncing like crazy; they'd do a little dance. The precision had to be rock solid. Everything is sub-pixel accurate."

If the animators saw that the CG model didn't match the body double's head—Clu's chin might not dip as much as Reardon's, for example—they'd send the track back for a better fit, even if the track was correct. "You're putting someone's head on another person's body," MacKenzie says. "Nuance plays into it. Sometimes you need to be less objective and more artistic. Otherwise, they just don't look right together."

Before Roble updated Track, he spent time researching computer-vision literature and technical papers, looking for ways to compute the tracking calculations. He didn't find much help. "There isn't much research in deformable head tracking, deformable object tracking," he says. "A lot of the computer-vision literature deals with tracking rigid objects or reconstructing scenes. I found other papers on tracking deforming things, like cloth. But most of that research was in constrained circumstances. We had moving cameras, moving sets, moving everything, and we had to track deformable objects. We could use some help from the academic community [on this problem]."

—Barbara Robertson
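The Nuke scripts Barba mentions for correcting vertical disparity have not been published, but the underlying idea is straightforward. The sketch below is only an illustration of the principle in plain Python — the function names and the constant-offset model are assumptions, not Digital Domain's method: estimate the average vertical offset between matched left- and right-eye feature points, then shift one eye to cancel it.

```python
# Sketch: remove a constant vertical disparity between stereo eyes.
# Real rigs can need per-pixel warps; a single offset is an illustrative
# simplification. Points are (x, y) pixel coordinates of matched features.

def estimate_vertical_disparity(left_pts, right_pts):
    """Mean vertical offset (right-eye y minus left-eye y) over matched points."""
    diffs = [ry - ly for (_, ly), (_, ry) in zip(left_pts, right_pts)]
    return sum(diffs) / len(diffs)

def correct_right_eye(right_pts, dy):
    """Shift right-eye points vertically to cancel the estimated disparity."""
    return [(x, y - dy) for (x, y) in right_pts]

left  = [(100.0, 50.0), (220.0, 80.0), (340.0, 120.0)]
right = [(92.0, 52.5), (212.0, 82.5), (331.0, 122.5)]  # 2.5-pixel misalignment

dy = estimate_vertical_disparity(left, right)
corrected = correct_right_eye(right, dy)
```

After correction, the matched features sit at the same scanline in both eyes, leaving only the horizontal disparity that carries depth.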
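Roble's point that "as you add new data, the original tracks get better" is, at bottom, a property of least-squares solving. The toy below is entirely hypothetical — Track's real solver handles full 3D cameras, survey measurements, and moving sets — but it shows the mechanism on a one-dimensional stand-in: estimating a single unknown camera offset from noisy point observations gets more accurate as points are added.

```python
# Toy sketch of a solve improving with more points. One unknown (a camera
# offset) is fit by least squares, which here reduces to the mean residual.

def solve_offset(observed, surveyed):
    """Least-squares estimate of a constant offset between observed and surveyed."""
    residuals = [o - s for o, s in zip(observed, surveyed)]
    return sum(residuals) / len(residuals)

surveyed = [10.0, 20.0, 30.0, 40.0, 50.0, 60.0]   # known set measurements
noise    = [0.5, 0.3, 0.4, -0.4, -0.5, -0.3]      # fixed measurement noise
TRUE_OFFSET = 3.0
observed = [s + TRUE_OFFSET + n for s, n in zip(surveyed, noise)]

err_few = abs(solve_offset(observed[:2], surveyed[:2]) - TRUE_OFFSET)  # 2 points
err_all = abs(solve_offset(observed, surveyed) - TRUE_OFFSET)          # all 6
```

With two points the noise barely averages out; with all six it cancels, and the estimate lands on the true offset — the "iterative process" of adding points in miniature.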
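Roble's "tiny springs" analogy for hair, and MacKenzie's worry about high-frequency jitter, can be demonstrated with a minimal damped-spring simulation. This is an assumption-laden stand-in for a production hair solver, not Digital Domain's code: a spring-driven tip follows a base track, and a track with sub-pixel alternating jitter keeps pumping motion into it while a clean track leaves it at rest.

```python
# Sketch: a single damped spring ("hair") whose root follows a tracked point.
# Semi-implicit Euler integration; units are arbitrary.

def simulate_tip(base_positions, k=200.0, c=2.0, dt=1.0 / 240.0):
    """Return the max deviation of the spring tip from its base over the track."""
    x, v = base_positions[0], 0.0
    max_dev = 0.0
    for b in base_positions:
        a = -k * (x - b) - c * v   # spring pulls tip toward base, damped
        v += a * dt
        x += v * dt
        max_dev = max(max_dev, abs(x - b))
    return max_dev

n = 2000
smooth  = [0.0] * n                               # rock-solid track
jittery = [0.002 * (-1) ** i for i in range(n)]   # sub-pixel frame-to-frame jitter

dev_smooth  = simulate_tip(smooth)
dev_jittery = simulate_tip(jittery)
```

The smooth track produces zero deviation; the jittery one never lets the spring settle — the "needle on a record player" doing its little dance, which is why the trackers had to hold the collar track sub-pixel steady.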
