Computer Graphics World

DECEMBER 2010

CG Environments

Pulling It Together

The lighting team lessened the work that compositors otherwise would have done, and compositors, for their part, reduced some of the work that might otherwise have been done in rendering. For example, a special technique developed in compositing changed the intensity of the glow lines based on distance. "It was slow to do," Lambert says. "We would have 20-minute renders to pre-comp. But in CG, it would have taken days and days."

The compositing team also developed a system that helped them quickly defocus objects in a scene. "We'd pick up a channel from the animators that told us where the camera focused, and then applied de-focusing to the appropriate elements," Lambert explains. "It would be extremely heavy to do that during the render, and it's way easier if you need to break apart an image if it doesn't have any form of de-focus applied."

Using the same system but with information from the camera rather than the animators, the artists composited CG objects and characters into live-action plates. "That way, the background CG and foreground CG were optically correct," Lambert adds.

The team relied on Nuke for all the compositing, using The Foundry's Ocula plug-ins for Nuke to deal with the stereo shots. "It was the only option," Lambert says. "It's so efficient."

Of all the stereo shots, those with bluescreen elements were the most difficult for the compositors to work with. "The image shot with one camera gets polarized differently from the other, so we have different specular intensities from one to the other," Lambert says. "We worked with The Foundry for about six months and came up with a technique that's fairly automatic. They incorporated that into Ocula, which really helped us out."

The compositors also occasionally needed to adjust for vertical misalignment, times when one camera was slightly higher than the other.
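Lambert describes two depth-driven tricks: dimming glow lines with distance, and defocusing elements based on a focus channel supplied by the animators. Digital Domain's actual tools aren't public; a minimal NumPy sketch of the underlying idea might look like the following, where the function names, the linear falloff curves, and the crude square-window blur are all assumptions, not the studio's implementation:

```python
import numpy as np

def glow_gain(depth, near=5.0, far=50.0):
    """Scale glow-line intensity down with distance from camera.
    The linear near/far falloff is an assumed curve, for illustration."""
    t = np.clip((depth - near) / (far - near), 0.0, 1.0)
    return 1.0 - t

def defocus_radius(depth, focus_dist, focal_range=2.0, max_blur=8.0):
    """Map per-pixel depth to a blur radius: zero inside the in-focus
    range, growing with distance from the focal plane."""
    offset = np.abs(depth - focus_dist)
    amount = np.clip((offset - focal_range) / focal_range, 0.0, 1.0)
    return amount * max_blur

def apply_defocus(image, depth, focus_dist):
    """Very crude variable blur: per pixel, average a square window whose
    half-width comes from the depth channel. O(n * r^2); a production
    defocus would be far more sophisticated (bokeh shape, edge handling)."""
    h, w = depth.shape
    radii = defocus_radius(depth, focus_dist).astype(int)
    out = np.empty_like(image)
    for y in range(h):
        for x in range(w):
            r = radii[y, x]
            y0, y1 = max(0, y - r), min(h, y + r + 1)
            x0, x1 = max(0, x - r), min(w, x + r + 1)
            out[y, x] = image[y0:y1, x0:x1].mean()
    return out
```

The point of doing this in the composite rather than the render, as Lambert notes, is that the depth channel is cheap to carry alongside the beauty pass, so focus can be adjusted per element without re-rendering.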
“We’d have to cut out the foreground, fix the background, and put it back together again,” Lambert says. “It was painful.” Troughout the film, they added particles to the air, in stereo, following the conceit that as Flynn had evolved his digital world, he had added atmosphere. In fact, in the climactic shots at the end that have everyone racing to- ward the portal through a sea of simulation, the formerly stark black and “light” environ- ment fills with clouds. “Te shots in the Lightcycle sequence, which didn’t have atmospherics, had about 100 layers,” Lambert says, “the bikes on three layers, reflections through three panes of glass, the stadium, the crowds, and so forth.” But the shots toward the end, with the char- acters racing toward the portal, had even more. “Tose shots had an entire atmospheric envi- ronment with clouds, the portal beam, and the Lightjets in there, as well,” says Lambert. “Tey had the most layers of any sequence. Probably 150 to 200 channels.” Because all the sequences are in stereo, the compositors couldn’t use any of the smoke and cloud live-action elements—usually a favorite technique to add photorealism, whereby art- ists place the photographed elements on cards and insert them into a 3D scene. “We wanted to avoid any kind of card-like De-rez All the material in the TRON world—Users, Programs, buildings, discs, Light- cycles, the Game Grid … everything—is made of 3D voxels. When Sam swings his wand to form a Lightcycle or Lightjet, the vehicles “rez” on—that is, change from wires and cubes into their 3D forms. When one thing smashes into an- other—a disc flung during a fight into a Program, one Lightjet into another—the object explodes into tiny cubes. “They’re like glassy ice cubes,” says sequence supervisor Nikos Kalaitzidis. 
“The material of the object is on an outer shell, and the inside is a glassy surface.” Effects lead Byron Gaswick’s team developed a procedural particle-based system within Side Effects’ Houdini that uses rigid-body dynamics to manage the colliding cubes when something de-rezzes. But, causing something to rez on or off was a combined effort between animation, effects, lighting, and com- positing. “Animators would have a certain amount of geometry to provide the timing for when the effect starts and ends,” says Kalaitzidis. “Once approved, ef- fects artists did all their crazy things. Once the cubes come out of effects, they go to lighting for rendering in [Chaos Group’s] V-Ray. And that output goes to compositing for pretty glows, color, and flares. And then you see piles of cubes acting like they would in the real world.” –Barbara Robertson feel,” Lambert says. “As soon as you put an ele- ment into a certain z-depth, it looks flat, es- pecially when everything else is rendered cor- rectly. So we rendered all the atmospherics in the film with [Side Effects Software’s] Houdini and in-house software.” Te climactic sequence takes visual effects beyond anything the filmmakers on the origi- nal TRON could have imagined possible—one of many such sequences in TRON: Legacy. But those creating this stunning film stand on the shoulders of that original crew, CG pioneers such as Robert Abel, Dave Inglish, Ken Perlin, Judson Rosebush, Frank Vitz, Chris Wedge, and many other highly respected people in the computer graphics community. Te Acad- emy voters snubbed the groundbreaking work done by that crew because they couldn’t see the future. Te future is now on screen in stereo- scopic 3D. It’s a good bet the crew of TRON: Legacy won’t suffer the same fate. n Barbara Robertson is an award-winning writer and a contributing editor for Computer Graphics World. She can be reached at BarbaraRR@comcast.net. December 2010 19
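The de-rez effect described above was built procedurally in Houdini with rigid-body dynamics and rendered in V-Ray; that pipeline can't be reproduced here, but the core idea, fragments launched from an impact point and integrated under gravity with damped ground bounces, can be sketched in a few lines. Every constant below (restitution, spawn velocities, the explicit Euler integrator) is an assumption for illustration, not Digital Domain's setup:

```python
import random

GRAVITY = -9.8       # m/s^2, world gravity on the y axis
RESTITUTION = 0.4    # fraction of speed kept on a ground bounce (assumed)

def derez(n_cubes=100, origin=(0.0, 5.0, 0.0), seed=1):
    """Spawn cube fragments at the impact point, each with a random
    outward velocity, as simple point masses."""
    rng = random.Random(seed)
    cubes = []
    for _ in range(n_cubes):
        vel = [rng.uniform(-3, 3), rng.uniform(0, 6), rng.uniform(-3, 3)]
        cubes.append({"pos": list(origin), "vel": vel})
    return cubes

def step(cubes, dt=1 / 24):
    """One explicit Euler step: apply gravity, advance positions, then
    resolve a damped bounce against the ground plane y = 0."""
    for c in cubes:
        c["vel"][1] += GRAVITY * dt
        for i in range(3):
            c["pos"][i] += c["vel"][i] * dt
        if c["pos"][1] < 0.0:
            c["pos"][1] = 0.0
            c["vel"][1] = -c["vel"][1] * RESTITUTION
    return cubes
```

Run enough steps and the fragments bounce, lose energy, and settle into the "piles of cubes acting like they would in the real world" that Kalaitzidis describes; a production rig would also handle cube-to-cube collisions and rotation, which this sketch omits.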
