Computer Graphics World

April/May 2012

Animation

When did you see your first shots?
We spent a long time planning. I didn't get my first shot off the floor until I had been there nine months. It was exciting.

Did you feel like you were working in slow motion?
Quite the opposite. There was a lot to do for a brand-new visual effects department. Every shot went through visual effects; every shot had a little bit of cleanup. So, we had over 1500 shots to work with. I was running dailies on a [FilmLight] Baselight machine, and I'd do bits of grade between shots. There was no slow-motion feeling at all. It was a very busy film. We had 40 units shooting, so we had lots of shots coming from the floor every day.

Was working on this film like working on a live-action film or an animated film?
I treated it as a live-action film in many ways because there was a large amount of greenscreen work and the lighting was real. To get the high quality we wanted, we needed to think live action. All the lights on the shots were computer-controlled, and the 'tower' had information about lighting levels, so shots with similar lighting used the same levels. The lighting was fabulous. If we had fully digital shots, we could wait for plates to turn up and match to shots around them.

Did you do previs?
We did, to help tell the story and to work out lots of things. We'd start with storyboards and then in [Autodesk] Maya do proper previs, high-quality animatics for the entire film. The previs gave us continuity.

What are the major differences between creating visual effects for stop-motion and for live-action films?
The world, the scale of the world, is a big difference. The characters are smaller and they're puppets. And, normally, on live-action films when you do a visual effects shot, you start work knowing the shots are locked. But, at Aardman, editors could re-animate the shot. We use digital stills; it's stop frame. Animators move a character and take a photograph. The images come off the floor using a proprietary piece of software that sends them to editorial, and editors could add frames and pull frames within a shot. So, if the animator shot frames 1, 2, 3, 4, and 5, because the camera was locked off, the editor could change that to 1, 1, 3, 4, 5. That meant the official take wouldn't come immediately off the floor, unless it was a sweeping camera. Getting a shot lock was especially important for us when we had animated water. It might be worth mentioning that they shot the film in stereo, so there were two pictures for every animation frame: one for the left eye and one for the right eye.
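To make that retime concrete: because the camera is locked off, holding or dropping stills amounts to remapping frame indices. The minimal Python sketch below illustrates the idea; retime_take is a hypothetical helper name, not Aardman's proprietary software, and in stereo the same remap would be applied to the left-eye and right-eye sequences alike.

    # Hypothetical sketch of the editorial retime described above; not
    # Aardman's pipeline code. With a locked-off camera, an editor can
    # hold or drop individual stop-frame stills by remapping indices.
    def retime_take(frames, remap):
        """frames: captured stills in shooting order.
        remap: 1-based frame numbers for the edited take, e.g.
        [1, 1, 3, 4, 5] holds frame 1 twice and drops frame 2."""
        return [frames[i - 1] for i in remap]

    take = ["f1.tif", "f2.tif", "f3.tif", "f4.tif", "f5.tif"]
    print(retime_take(take, [1, 1, 3, 4, 5]))
    # ['f1.tif', 'f1.tif', 'f3.tif', 'f4.tif', 'f5.tif']
    # In stereo, run the same remap on both eyes' image sequences.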
Did you produce photorealistic water simulations?
The brief was to make it look fabulous in an Aardman way. We needed to give the CGI the creative style of an Aardman project. It's not quite photoreal, but it has photoreal qualities. We had close-up water, distance water, and oceans of water. We didn't want to do something that looked like a real ocean, and we had to reduce the scale of the water to match the scale of the Aardman world. Also, we wanted a more plastic feel, a little more squash and stretch to give it a more modeled feel.

What software did you use for the water?
We had a Maya pipeline with [Side Effects] Houdini and [Pixar] RenderMan. We used [Next Limit's] RealFlow for the close-up splashes, and it all came through [The Foundry's] Nuke. But most of the water was proprietary.

Why did you build proprietary tools?
To make the water look like what the director wanted. Most simulation tools aim at copying nature. We needed to do more than that. We needed to do squash and stretch, we needed new kinds of shaders, and we needed to animate water. There isn't a water solution available to do what we needed to do. We had about 10 different ways to do water because we had such a variety of water types. On a wide shot with a boat crashing through the water, we used particles, fluid simulations, and proprietary tools for our main mesh. On a close-up, when we needed splashes, we'd use RealFlow. For spray, particles. For a wider mesh, we'd animate geometry and use water shaders on top. The water has to change color in various locations, so we'd use photo reference. In the Caribbean, we had blue/turquoise water; in the river Thames, dark, menacing water. We achieved the look with RenderMan shaders.
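The squash-and-stretch treatment can be sketched in miniature. The Python fragment below is an illustrative stand-in, not the production tool: it applies a volume-preserving non-uniform scale about a pivot, the kind of post-simulation shaping that pushes a splash toward a plastic, modeled feel. squash_points and its parameters are hypothetical names.

    # Illustrative stand-in for stylizing simulated water, not the
    # studio's proprietary tool. squash < 1.0 flattens points
    # vertically about the pivot while x/z stretch by 1/sqrt(squash),
    # roughly preserving volume so the splash bulges outward.
    def squash_points(points, pivot, squash):
        stretch = 1.0 / squash ** 0.5
        px, py, pz = pivot
        return [(px + (x - px) * stretch,
                 py + (y - py) * squash,
                 pz + (z - pz) * stretch)
                for x, y, z in points]

    # Flatten a small splash to 70% height, bulging it sideways.
    splash = [(0.0, 1.0, 0.0), (0.5, 0.5, 0.5), (-0.5, 0.2, 0.3)]
    print(squash_points(splash, pivot=(0.0, 0.0, 0.0), squash=0.7))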
Did you create any digital characters?
Yes. Again, when we have wide shots and we need to see several hundred characters, it becomes more cost-effective to do digital. Imagine how slow it would be for one animator to move that many real characters on set, and you'd have to be careful not to knock into one of them. Sometimes we had live action in front that we used for lighting reference, and put digital characters next to them.

What did you use for crowd animation?
We have our own tools. Normally, the crowds were all doing the same thing. We might have all the characters sitting down in an amphitheater and then they all stand up. There was no need for AI. They usually followed the lead from some real puppets.

Did you do cleanup on the puppets?
We had thousands and thousands of mouth shapes; every character has hundreds. Another department printed mouth shapes using rapid prototyping. It was a massive undertaking, but it was a fabulous decision to go with that. The downside is that there was a horizontal line beneath the eyes or in the nose area that needed to be painted out by the visual effects department. We wrote tools in Nuke to do this. For example, if a character rotated its head and the lighting changed across the face, we had one set of tools. If the character was static, we had simpler tools. All these tools had to work in stereo. In some shots, artists painted out the lines. At the start of the project, it was fairly painstaking, but it got faster and faster. It's such a long project, we were able to optimize how we did things.

Are there any matte paintings in the film?
We used matte paintings predominantly for skies, for projections on geometry, and for the back of shots. We'd have textured geometry in foregrounds. The island was completely geometry, as were the buildings at the edge of the river. With stereo, geometry just feels better.

Are there any fully digital shots?
Maybe a dozen shots were fully digital. They were digital for things they couldn't shoot. For example, we had some wide shots of the ocean with a big ship, and the greenscreen sets couldn't be big enough to put a camera far enough away. Therefore, we'd go with a digital ship and digital water. The island was predominantly digital.

How was it for you personally to work at Aardman?
It's a studio, really, so you're closer to the script and to the director. Unlike doing visual effects in Soho, where the post house works with the production studio, there is no middle layer. Aardman is the studio. It was fun. I've seen the film now. It was fabulous to be in the audience and see their reactions. I haven't seen one negative review in the UK press. Aardman does not make many films. Being a visual effects supervisor on one of them was fantastic. ■
