Computer Graphics World

May/June 2014

Issue link: https://digital.copcomm.com/i/336920


…her lines. "Animators listened to the audio track and worked with Matthias Wittmann, our facial animation supervisor, to nail the mouth shapes so they hit the dialog," Port says.

The eyes were a special concern. As anyone who evaluates virtual humans and digital doubles knows, the eyes can make or break the illusion. For reference, the crew had high-speed footage of the actors' eyes filmed extremely closely under specific lighting conditions. "We realized that when someone blinks, the blinking is not just up and down," Hendler says. "When the eyelid comes down, it takes a different path. We tried to mimic a lot of detail that we saw in physically correct ways."

Port is particularly proud of the work the team did with the eyes. "We have done the eyes so well that you could have only the eyes on a full 100-foot screen, in motion, and they would hold up," he says. "All those little details – eye water, modeling, and shaders within the eye itself, the corners of the eyes, the wetness, the irises...all that is in there. The whole eye area was critical."

As the animators worked, they could see wrinkles and fine details in the skin on the pixies' faces. "We always want to make sure our animators work in a real-time environment, so we spent a huge amount of development to create a real-time version of our facial rigs," Hendler says. "We have all our 3,000 face shapes with dynamic wrinkles working in real time in [Autodesk's] Maya. And we created CG effects shaders within the Maya viewport. In the past, we would run the animated characters through a light render so the animators could see what they looked like. Now, they can see how wrinkles affect the expressions."

Wrinkles and More

"One of the great things about the rigs and the fast graphics cards in our workstations is that we could implement CG effects that we usually don't see until a shot has been lit, rendered, and comp'd," Port says.
"Before, the animators would work with a low-resolution animation puppet and would have to guess what the face would look like with its displacement shaders. Now, they have the benefit of real-time wrinkle maps and displacements. They could even see the blood flowing beneath the skin (see "Blood Flow Beneath the Surface," page 18), although most animators turned that off. The wrinkles and displacements, which affected the shape, were more informative. Blood flow was more important when the shots went to lighting and comp. It's pretty cool."

In addition to using Maya for animation, the team used that software for cloth simulation, rigging, and modeling. Pixologic's ZBrush and Autodesk's Mudbox also helped with high-resolution sculpting. Chaos Group's V-Ray rendered all the shots, including, via a custom interface, shots with clouds and effects created in Side Effects' Houdini. Compositors used The Foundry's Nuke.

Cloth simulation was complex. "I had never before done a cloth simulation with thousands of lily petals," Hendler says. "And not just petals. Leaves with different thicknesses, spines, veins. We had to mimic all that in their digital wardrobes. We tried to build [real] miniature versions of the wardrobes, sewing flowers and leaves together to see how they would move. One guy spent every night for a week building a hydrangea bodice. We were all familiar with fabric, but building with organic materials was something we'd never done before. And we kind of hope we never do again."

Thistletwit, for example, had a dandelion skirt with grassy layers of leaves, plus layers of fur and hair. "She had 12 or 13 hair and fur grooms," Hendler says. "Every time she bends and moves, millions of hairs interact correctly." An in-house hair grooming system called Samson provided the controls for complex hairstyles to handle the layers of pixie hair, and to create long, flowing hair for some shots with Angelina Jolie.
"The bar is so high these days, the audiences are so sophisticated, there's a high threshold for the quality we have to hit to be competitive," Port says. "The computers and servers are faster, and having more processors helps. And then on top of that, we add technology that we've developed at Digital Domain over the past 10 years."

The result of that development is three digital characters that bring us closer to a photoreal digital double than ever before. "I really love the pixies," Port says. "Now that we've nailed it, I'd like to do a whole movie with just the pixies." ■ CGW

Barbara Robertson is an award-winning writer and a contributing editor for CGW. She can be reached at BarbaraRR@comcast.net.

Shared Environments

Digital Domain artists created some of the CG environments that surround and sweep beneath Maleficent as she flies through the fairy-tale world. "We had volumetric cloudscapes in stereo," says Darren Hendler, digital effects supervisor at Digital Domain. "She flies down and through canyons above the water, and we see MPC's creatures flying and diving out of the water. We created water simulations with their creatures. And it was all in stereo, so there was no cheating."

The studios accomplished the shots using Alembic and deep files. "We could take a deep file from MPC and use that in [Side Effects'] Houdini to hold out our water," Hendler says. "It was some of the most complex sharing we've ever done." – Barbara Robertson
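The deep-file holdout Hendler describes relies on each pixel storing multiple depth-sorted samples with opacities, so one studio's element can occlude another's without shared geometry. A minimal sketch of the underlying idea, assuming samples are (depth, alpha) pairs with smaller depth meaning closer to camera (the data layout is simplified; real deep images per OpenEXR 2 also carry color and sample intervals):

```python
def hold_out(samples, holdout_samples):
    """Attenuate each sample's alpha by the transmittance of every
    holdout sample in front of it.

    `samples` and `holdout_samples` are lists of (depth, alpha) pairs;
    a holdout sample with alpha 1.0 fully occludes anything behind it.
    """
    result = []
    for z, alpha in samples:
        transmittance = 1.0
        for hz, h_alpha in holdout_samples:
            if hz < z:  # holdout sample is closer to camera
                transmittance *= (1.0 - h_alpha)
        result.append((z, alpha * transmittance))
    return result


# A half-opaque creature sample at depth 2.0 dims a water sample
# at depth 5.0, but leaves one at depth 1.0 untouched:
water = [(1.0, 1.0), (5.0, 1.0)]
creature = [(2.0, 0.5)]
print(hold_out(water, creature))
```

Because the holdout is resolved per sample rather than with a flat matte, partially transparent elements like spray and clouds composite correctly even when the two studios' renders interleave in depth, which is what made the stereo shots possible "with no cheating."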
