Computer Graphics World

January/February 2015


lightbulbs rather than the sun illuminated the CG characters and sets in many shots. "The spherical harmonics pipeline we had written for Avatar wasn't ideal for interiors," Letteri says. "You really want to use ray tracing, and the kind of distribution ray tracing we were working with in old versions of [Pixar's] RenderMan was expensive to do. You fire a ray, hit something, [fire] a bunch more rays and hit more things, and you end up with a dense, heavy tree structure that's accurate but hard to work with."

So, the Weta researchers began investigating path tracers. "With this method, you construct bidirectional paths of light from the camera to the light, or vice versa, and solve the integral equations to compute the light transport," Letteri explains. "It's more promising, and there was more research into the mathematics for solving those problems." (The standard form of that integral, and the sampling estimate a path tracer applies to it, appear below.)

The goal was to load a complex scene, do the path tracing, and then use that as a guide to how they might break down lighting to push shots through as usual.

Before writing their own software, the R&D team looked at PBRT and Mitsuba. Matt Pharr, Greg Humphreys, and Pat Hanrahan received a Sci-Tech Award for PBRT, which is now detailed in Pharr and Humphreys' book "Physically Based Rendering." Dr. Wenzel Jakob's Mitsuba is a research-oriented rendering system in the style of PBRT.

"They do path tracing, and they allowed us to test those algorithms," Letteri says. "But we needed something more production friendly. We needed to test path tracing with real shots and real scenes. So, we wrote a path-tracing engine that is RenderMan-compliant so we could feed it RIB files and our RenderMan shaders." And that made it possible to test the new engine using real production assets.

"Once we started doing that and saw where we could go, it became apparent that we were getting enough speed," Letteri says. "I kind of knew it would happen, that it would become viable for production, because it was a two- to four-year project. By the time we got all the elements, I was hoping the hardware would be faster and it would all come together. And, it did. We have a full bidirectional path tracer now, and we can put any sampling algorithm into it."

And that means, with Manuka, the path tracer they developed, the experiment can continue alongside its use in shot production. "We wrote Manuka with two goals," Letteri says. "One was to be a research test bed in which we can do whatever we want. We can do brute-force sampling and see an exact answer, even if it takes days and days. That's called an unbiased renderer. But then, also, we can bias the renderer and trade accuracy for speed. We know what [the image] is supposed to look like, but we're willing to do something different to get the shots done. We can switch between the two."

Because the team designed Gazebo with RenderMan and Manuka in mind, it could output a scene to RenderMan or the fully RenderMan-compliant Manuka. "That allowed us to do quick comparisons and to migrate quickly," Letteri says. "We could test a shot in Manuka and, if we ran into problems, fall back to RenderMan. We didn't want to have to rewrite all our RenderMan shaders. Every time a new rendering algorithm came along, it would mean we'd have to rewrite shaders to tease apart surface descriptions and light transport. We did that early on, which allowed us to migrate between renderers more easily. So, now, we can use our PRMan shaders in Manuka."
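For readers who want the math behind those quotes: the "integral equations" of light transport that a path tracer solves are, in the standard textbook formulation (the one covered in Pharr and Humphreys' "Physically Based Rendering," not a description of Manuka's internals), the rendering equation

L_o(x, \omega_o) = L_e(x, \omega_o) + \int_{\Omega} f_r(x, \omega_i, \omega_o)\, L_i(x, \omega_i)\,(\omega_i \cdot n)\, \mathrm{d}\omega_i ,

and "brute-force sampling" amounts to estimating that integral with a Monte Carlo average over N sampled directions (or, equivalently, over full camera-to-light paths):

\int_{\Omega} f_r\, L_i\,(\omega_i \cdot n)\, \mathrm{d}\omega_i \;\approx\; \frac{1}{N} \sum_{k=1}^{N} \frac{f_r(x, \omega_k, \omega_o)\, L_i(x, \omega_k)\,(\omega_k \cdot n)}{p(\omega_k)} .

The expected value of that estimate equals the true integral, which is what makes such a renderer "unbiased"; biased modes give up that guarantee, for example by caching, interpolating, or clamping samples, in exchange for speed.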
The studio used Manuka for the first time on the big crowd shots in Dawn of the Planet of the Apes and discovered it could render the fur on large groups of apes efficiently. Thus, when it was Five Armies' turn a few months later, they rendered 90 percent of the film in Manuka, relying on RenderMan only for the fire elements in Lake-town because they hadn't yet optimized Manuka for volumetrics.

"As of Five Armies, it has become our renderer," Letteri says. "We're working on both in parallel, but Manuka proved its worth so much, that's what we used wherever we could. Because it handles big scenes, the TDs spend less time managing the scene and more time on

[Image caption: Weta's Gazebo provided real-time lighting for virtual production; Manuka rendered scenes using path tracing.]
