Computer Graphics World

January / February 2016


The animation team used Premo, DreamWorks' new in-house animation system. "Animators can now touch the sculpture that is a panda and truly interact with the model itself," Carloni says.

Supervising Animator Jason Reisig explains that although he can use a mouse and keyboard shortcuts, he doesn't need either. Instead, he performs a character using Wacom's Cintiq tablet. "I want to touch something and move it," Reisig says. "I can pose right on the character. Move him around, grab the corners of his mouth, change expressions, move an arm. I can do this mirrored or not. The process is very tactile. It feels like working with a stop-frame puppet, but I'm posing at nearly 24 frames per second."

Reisig was a member of the advisory board that helped develop Premo. "We first began talking about this six or seven years ago, and tried to determine what would be most important," Reisig says. "We gathered a ton of information and guidelines for early development. The top 10 requests were about speed and the fidelity of the characters, about interacting with characters in real time in high resolution. We knew R&D would have to build tools from scratch, and we couldn't do everything at once."

R&D had Premo ready for animators on How to Train Your Dragon 2, and feedback from that production influenced the next generation. Animators on Kung Fu Panda 3 were especially excited about Premo's new shot browser.

"Historically, we would open a shot, load assets, work on it, save it, close it, open a new shot, load assets, and so forth," says Supervising Animator Ludovic Bouancheau. "If I wanted to copy and paste across shots, I'd have to go to one shot, copy, close it, then open another shot. I'd be doing that blindly. Now, we have an open pipeline to multiple shots. Premo looks at all the assets and loads everything. So, we can access all the shots in a sequence at the same time. I can look at an entire sequence and draw on it, and I can copy and paste across shots."

Although animators are "in" only one shot (that is, one file) at a time, the rest of the sequence is available and viewable in a shot browser on-screen with the shots' current stage of production – whether storyboard, layout, final, or something in between. Animators can scrub through a sequence and instantly switch from the shot they are working on to another. They can't override another person's work, but they can see it.

"We can choose categories that we want in our browser," Bouancheau says. "I can choose which layout version – previs, rough, or final layout. I can look at three types of rendering: effects, character effects, lighting. I can add categories. On Panda, we were doing a lot of 2D animation for fighting choreography, so I could play that animation in the shot to reference it."

This ability to view an entire sequence had an impact beyond an individual animator's ability to work on shots. "It changed how we cast and run the department," Reisig says. "Now, animators handle groups of shots, not one here and there. They can go from one shot with a lot of characters
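Premo itself is proprietary, but the shot-browser workflow Bouancheau describes (one editable shot, with every other shot in the sequence loaded for viewing, scrubbing, and cross-shot copy and paste) can be illustrated with a small, purely hypothetical Python sketch. The class and method names below are invented for illustration; they are not Premo's actual interfaces.

# Hypothetical sketch of the shot-browser behavior described above;
# not Premo code, just a toy model of the workflow.
from dataclasses import dataclass, field

@dataclass
class Shot:
    name: str
    stage: str                                  # e.g. "storyboard", "layout", "final"
    poses: dict = field(default_factory=dict)   # frame number -> pose data

class SequenceBrowser:
    """Loads every shot in a sequence; only one shot is editable at a time."""
    def __init__(self, shots):
        self.shots = {s.name: s for s in shots}
        self.active = None                      # the one shot open for animation

    def open(self, name):
        # Switch which shot is editable without closing and reopening files.
        self.active = self.shots[name]

    def scrub(self):
        # View every shot and its current stage of production.
        return [(s.name, s.stage) for s in self.shots.values()]

    def copy_pose(self, src_shot, frame):
        # Copy and paste across shots; other shots are readable, not writable.
        pose = dict(self.shots[src_shot].poses[frame])
        self.active.poses[frame] = pose

# Example: work in one shot while referencing another shot in the sequence.
browser = SequenceBrowser([Shot("sq01_sh010", "final"), Shot("sq01_sh020", "layout")])
browser.shots["sq01_sh010"].poses[101] = {"jaw": 0.4}
browser.open("sq01_sh020")
browser.copy_pose("sq01_sh010", frame=101)

In this toy model, only the active shot accepts edits, which mirrors the rule that animators can see, but not override, another person's work.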
EFFECTIVE TOOLS

"Looking back," says Visual Effects Supervisor Mark Edwards, "this was a pretty big effects movie. We had snow, rain, splashes, destruction, blowing sand, dust, waterfalls, and petals, lots of petals. I think we pushed the state of the art in a few areas. The Jade Palace destruction, for instance. The number of assets is the highest we've had at the studio, and we had massive amounts of rigid-body destruction. Especially close-up. We had a bazillion scrolls. Debris. We had so many materials – paper, stone, wood. Our sims showed flexing and splintering."

The effects department has based many of its tools on Side Effects' Houdini. Flux, a fast, hybrid gas solver with options for combustion, handles various volumes with a tight integration into Houdini via SOP nodes. The artists used Flux for smoke, fire, clouds, dust, and so forth. An internal solver called Rapid, also integrated into Houdini, simulates liquids. Character effects artists relied on the FXGear Qualoth plug-in for Maya for cloth simulations. Two proprietary hair and fur setup and grooming systems, Fur and Willow, provided guide hairs for Houdini-based simulations.

"We optimized our Fur caching to handle lots of pandas and the grass in panda village," Edwards says. "[R&D Engineer] Feng Xie helped us devise a way to cache out a lot of data at one time. It was a huge optimization."

For lighting, the team uses a proprietary tool called Torch; for rendering, the studio's own point-based global illumination (PBGI) system called Moonlight. "We developed a way to trace shadows with PBGI data, updated the renderer to deal with AOVs (arbitrary output variables), and pushed out some fixes to do deep compositing in [The Foundry's] Nuke," Edwards says. "Also, for certain things, we used [Side Effects'] Mantra. With our graphic look, it was imperative to push the color, so we did a lot of work in Nuke to keep the saturation."

A small team working in Adobe After Effects handled the integration of elements into the 2D scrolls, and the inky, stylized credits at the end. "When you see the 2D scroll in stereo, it looks really cool because of the After Effects elements, even though it's flat," Edwards says.

ARTISTS OFTEN RELIED ON MATTE PAINTINGS FOR A COLORFUL GRAPHIC STYLE.
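Torch and Moonlight are in-house tools, so the exact comp setups aren't public, but the general pattern Edwards describes (bringing deep renders into Nuke and pushing saturation to hold the graphic look) can be sketched with Nuke's standard Python scripting and stock nodes. The file paths and the saturation value below are placeholders, not production values.

# Hypothetical Nuke Python sketch; runs inside Nuke's script editor.
# Stock nodes stand in for the studio's own comp setups.
import nuke

# Placeholder path to a deep render; a real setup would point at studio EXRs.
deep = nuke.nodes.DeepRead(file="/shots/sq01/sh010/beauty_deep.%04d.exr")

# Flatten the deep samples into a regular 2D image before color work.
flat = nuke.nodes.DeepToImage()
flat.setInput(0, deep)

# Push saturation to keep the film's colorful graphic style.
sat = nuke.nodes.Saturation()
sat.setInput(0, flat)
sat["saturation"].setValue(1.2)   # placeholder strength

out = nuke.nodes.Write(file="/shots/sq01/sh010/comp_v001.%04d.exr")
out.setInput(0, sat)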
