Computer Graphics World

March 2011


Performance Capture

…produced the shading and lighting via an in-house tool called Isotope that moved files from Maya.

"We know what 'real' looks like," Baillie says. "Being a half-percent wrong puts a character in the Uncanny Valley. So, we made the decision to stylize the characters and have a bit of fun with them. When you have characters that are more caricatures than photoreal humans, the audience lets the character off the hook a little bit."

To help animators see how the eyes would look once rendered, R&D engineer Mark Colbert created a virtual eye. "Usually animators work a little bit blind," Baillie punned, and then explained that animators have geometry for the iris, pupil, cornea, and so forth. But the animation packages don't show the refraction from the cornea to the pupil, and that refraction shifts the pupil's apparent position. Thus, animators often have to adjust the eye line after rendering.

"Mark [Colbert] created a way for animators to see the effect of the changing refraction in the Maya viewport using a sphere with a bump for corneal bulge and a CGFX shader," Baillie says. The CGFX shader produced the results in real time.

Much of the film takes place in the shiny-metal Mars underground, which became an interesting rendering predicament. "RenderMan is phenomenal at displacement, motion blur, and hair, but shiny things and raytraced reflections are challenging," Baillie says. "We couldn't use spotlights. We had to have objects cast light. So we implemented point-based lighting techniques for reflections."

Christophe Hery, who is now at Pixar, had joined ImageMovers Digital during production and helped the crew implement an evolution of techniques developed while he was at Industrial Light & Magic, and for which he had received a Sci-Tech award (see "Bleeding Edge," March 2010). "We rendered scenes with fully reflecting walls and floors in 15 minutes," Baillie says. "It was unheard of. That optimization really saved our butts.
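The physics behind the corneal-refraction preview Baillie describes is Snell's law: a viewing ray bends as it crosses from air into the cornea, which is why the pupil appears displaced. The sketch below is not IMD's CGFX shader; it is a minimal illustration of the underlying calculation, assuming a refractive index of about 1.376 for the cornea.

```python
import numpy as np

def refract(incident, normal, n1, n2):
    """Snell's-law refraction of a unit incident ray at a surface with
    unit normal. Returns the refracted unit vector, or None on total
    internal reflection."""
    eta = n1 / n2
    cos_i = -np.dot(incident, normal)
    sin2_t = eta**2 * (1.0 - cos_i**2)
    if sin2_t > 1.0:
        return None  # total internal reflection
    cos_t = np.sqrt(1.0 - sin2_t)
    return eta * incident + (eta * cos_i - cos_t) * normal

# A viewer looks at the cornea off-axis; the ray passes from air (n = 1.0)
# into corneal tissue (n ~ 1.376) and bends toward the surface normal.
view = np.array([0.3, 0.0, -1.0])
view /= np.linalg.norm(view)
normal = np.array([0.0, 0.0, 1.0])  # corneal surface normal at the hit point
bent = refract(view, normal, 1.0, 1.376)
```

Tracing `bent` instead of `view` down to the iris plane gives the shifted pupil position a viewport shader could display, which is what lets animators judge the eye line before rendering.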
We did all our indirect illumination using point clouds generated from all the lights in the scene that had to emit light. We'd bake the floor into a point cloud, and had those points cast lights."

When the characters Gribble and Milo find themselves in an abandoned Martian city with glowing lichen and other bioluminescent vegetation, the crew handled the lighting by baking each little plant into a point cloud that cast light onto the ground. They also used point clouds for subsurface scattering on the characters' faces.

"Christophe really helped out a lot with that," says Baillie. "We had the characters singing on all four cylinders because we had the guy who invented the technique working with us. We started with a shadow-map version of subsurface scattering but ended up preferring the look and speed of point-cloud-based subsurface scattering."

When they reached the limitations of the point-cloud techniques—lips might glow when they touched—Pixar's RenderMan development team jumped in to help.

"I think the thing that had the biggest impact on the film at the end of the day, and the scariest at first, was the extensive use of indirect lighting with point clouds," Baillie says. "The studio had put a lot of hours into a system they had used for A Christmas Carol, so it was hard to persuade them to change. But I'm super glad we pursued it."

[Image captions: (Top) The design for the aliens meant they had to be CG characters; they couldn't be people in suits. Animators scaled the mocap data appropriately. (Bottom) Gribble and Milo are two of the four human characters in the film, but they are caricatures, too, and that helps them avoid the Uncanny Valley.]

Moving On

While they were still in production, the crew learned that Disney would close the studio. "It was sad," Wells says. "I understand Disney's decision from a business point of view. To keep the studio running, they'd have to guarantee a tent pole every year.
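The idea of baking geometry into a point cloud and having "those points cast lights" can be sketched as a gather loop: every baked point acts as a tiny emitting disk with a position, normal, area, and radiance, and a shading point sums their contributions with cosine and inverse-square falloff. This is a simplified, unoptimized illustration of point-based illumination in general (a real pipeline, like RenderMan's, clusters points hierarchically for speed); all names here are hypothetical.

```python
import numpy as np

def gather_point_cloud(shade_p, shade_n, points, normals, areas, radiances):
    """Accumulate indirect light at a shading point from a baked point cloud.
    Each cloud point approximates a small emitting disk: position, normal,
    disk area, and outgoing RGB radiance."""
    total = np.zeros(3)
    for p, n, area, rad in zip(points, normals, areas, radiances):
        to_light = p - shade_p
        dist2 = float(np.dot(to_light, to_light))
        if dist2 < 1e-8:
            continue  # skip coincident points
        w = to_light / np.sqrt(dist2)
        cos_recv = max(float(np.dot(shade_n, w)), 0.0)   # receiver faces emitter
        cos_emit = max(float(np.dot(n, -w)), 0.0)        # emitter faces receiver
        total += rad * area * cos_recv * cos_emit / (np.pi * dist2)
    return total
```

A glowing lichen plant baked to a handful of such points would light the ground beneath it through exactly this kind of sum, with no spotlight involved, which matches the "objects cast light" requirement Baillie describes.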
They didn't want to carry another standing army of 450 artists. But from the point of view of creative artists, this crew was working together so efficiently and fluidly, producing such high-quality work, and it was heartbreaking to see it broken up."

Already, Baillie has joined with two other former crew members from ImageMovers Digital: Ryan Tudhope, who came to IMD from The Orphanage, and Jenn Emberly, who was performance supervisor at IMD and, before that, animation supervisor at ILM. The three have founded Atomic Fiction, a visual effects studio based in Emeryville, California. For his part, Wirtz founded Bayou FX in his home state of Louisiana, in Covington, near New Orleans.

It's likely that the former IMD'ers will continue networking in interesting ways, as they take what they have learned at the studio and expand it out into the universe. "Every once in a while a bunch of us get together and reminisce about the awesome things we did together," Baillie says. "One of the hardest parts of my job was figuring out which amazing idea to go with. It was a constant barrage of consistent amazement."

Barbara Robertson is an award-winning writer and a contributing editor for Computer Graphics World.
