Computer Graphics World

January/February 2015


lighting. Big scenes are Manuka's backbone. We have to create so many populated environments and worlds, we designed it with that in mind." Big scenes also mean big data, particularly when filmmakers use virtual production rather than live-action photography.

NEW TECH: PIPELINE

"We had been developing the pipeline for the next Avatar, but we decided to roll it out for this project," Saindon says. "It's a different way to organize our assets and keep track of them. We had more CG shots and more assets and characters to look after than ever before."

Letteri chronicles the difficulties inherent in tracking assets for virtual production by comparing that method of filmmaking with live action. "In a live-action shoot, there are costumes, lighting, and so forth, but in the end all that's boiled down to a piece of film," Letteri says. "That's what we start from. We have to know what's in the plate, and we can measure everything in the plate, but we don't have to worry about how it got there. We don't have to track the costumes and so forth while they're shooting. But when we're doing virtual production, all that becomes our problem and part of our pipeline. What costume the character was wearing and whether the director changed the costume from April to May, for example. We have to track history back and forth. We tracked all those complexities for Avatar mostly by hand."

The new pipeline manages the assets for the artists, but change, even welcome change, doesn't always come easily. "Our underlying pipeline was almost 100 percent different from the new pipeline," Saindon says. "The way we send things to the render wall with new queuing software, the way we see assets, and our lighting tools and renderer are all different. The first month was difficult, really rough, shocking. No one knew where to get anything. But I'm not sure we would have finished this movie without it.
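Weta's asset system is proprietary, but the tracking problem Letteri describes, knowing which version of an asset (say, a costume) was current at any point in production, can be sketched as a minimal version history with date-based lookup. Everything below (class names, asset names, dates) is invented for illustration; it is not Weta's actual tooling.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class AssetHistory:
    """Hypothetical sketch of per-asset version history (not Weta's system)."""
    name: str
    versions: list = field(default_factory=list)  # (date, payload) pairs

    def record(self, when: date, payload: str) -> None:
        """Record a new version of the asset, keeping versions date-ordered."""
        self.versions.append((when, payload))
        self.versions.sort(key=lambda v: v[0])

    def as_of(self, when: date) -> str:
        """Return the latest version recorded on or before `when`."""
        current = None
        for stamp, payload in self.versions:
            if stamp <= when:
                current = payload
        if current is None:
            raise LookupError(f"no version of {self.name} exists before {when}")
        return current

# Illustrative use: the "costume changed from April to May" case from the article.
costume = AssetHistory("character_costume")
costume.record(date(2014, 4, 1), "costume_v1")
costume.record(date(2014, 5, 10), "costume_v2")
print(costume.as_of(date(2014, 4, 15)))  # -> costume_v1
print(costume.as_of(date(2014, 6, 1)))   # -> costume_v2
```

A shot dated in April resolves to the April costume even after the May change is recorded, which is the "track history back and forth" behavior Letteri says was done mostly by hand on Avatar.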
Peter was still editing and writing until August. We had a lot to do in a short time."

In fact, when Jackson filmed on location for Five Armies two years ago, the battles were not planned. "Peter wanted to go onto the motion-capture stage, load in a scene, and film the shots the way he wanted," Saindon says. Before the new tools, that would have been difficult: the scenes were too large, the amount of information too great, and taking motion-capture data through motion editing, lighting, and texturing, then putting it back onto the stage for Jackson to film, was a problem. The new pipeline made it possible. "We didn't get greenscreen shots for the battle," says Saindon. "A lot of it was developed on the motion-capture stage with Peter, Terry Notary [stunt performer, movement choreographer], and Glenn Boswell [stunt coordinator]. Then Peter would shoot it with his mocap cameras."

NEXT-GEN VIRTUAL PRODUCTION

"We can open a scene on a motion-capture stage that anyone else would be working on at the same time. I can load the same scene in Nuke, Maya, and MotionBuilder, have the same data, see everything going on, and do all kinds of conversions," Saindon says. "And, it was very lightweight."

While he was on the motion-capture stage, Jackson could see thousands of CG characters running around and fighting through his virtual camera. "He could film the battle as it happened," Saindon says. "I think he loved it. We have to account for more elements when a director can put the camera anywhere. So that means a lot of work in animation. But the advantage is that once the shot gets past a director, it's almost ready. We can almost go directly into shots and get through the pipeline quickly."

Because all the software works within the same pipeline and because Gazebo is so fast, animators can drop a lighting setup into a scene for a presentation.
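Loading the same scene in Nuke, Maya, and MotionBuilder implies some application-agnostic scene description that each package's importer resolves. As a loose sketch only, one common pattern is a lightweight manifest that any tool can parse; the field names and values below are invented for illustration and are not Weta's actual format.

```python
import json

# Hypothetical application-agnostic scene manifest (illustrative only).
# Each DCC app would resolve asset ids and mocap takes through its own importer.
scene = {
    "name": "battle_block_07",
    "assets": [
        {"id": "orc_soldier", "variant": "armored", "count": 1200},
        {"id": "battlefield_environment", "variant": "war_damaged", "count": 1},
    ],
    "mocap_takes": ["orc_charge_take_01"],
}

# Serialize once; any application with a JSON parser reads identical data.
manifest = json.dumps(scene, indent=2)
loaded = json.loads(manifest)
print(loaded["assets"][0]["count"])  # -> 1200
```

The point of such an interchange layer is the "same data" property Saindon describes: no per-application copy of the scene to drift out of sync, and the manifest itself stays lightweight because heavy geometry and capture data are referenced by id rather than embedded.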
"We can see final lighting on an animation scene as it's going along, and we can send Peter our lighting setup in animation when it's being blocked out," Saindon says. "Because lighting done by the TDs is dropped into an animation scene, and because our puppets are so high res now, Peter couldn't tell if he was looking at the scene for animation or lighting."

ORCHESTRATING THE BATTLE

To give Peter the most freedom possible, Gilman's team met early with the director to map out critical moments in the battles – which Gilman describes as any fighting with more than 30 characters. "We identified moments when the tide turns, moments of conflict, moments of transition to determine when we would need the largest amount of motion capture," Gilman says.

They also defined how each race of combatants would march and charge. "Instead of having each race march or charge the same, they needed to move in fundamentally different ways," Gilman says. "One of the first things I learned as a creature animator is that a performance with monotone timing is boring. So, we thought of each race with almost musical concepts. Orcs do formations in beats of five. When the dwarves march, they always march off-step relative to the rank behind them, so we get oscillation between their shoulders. And the elves move in beats of three."

Notary and Boswell spent six weeks shooting 15 motion-capture performers to provide animators with enough data. "You could hear on stage when Terry [Notary] was choreographing, he'd call out beats of five for the Orcs, or play music," Gilman says. The elves, moving to their waltzing rhythm, performed in threes. "We'd choreograph two elf swordsmen protecting a single archer,"

"It isn't until you place a camera inside a group of 3,000 guys trying to kill each other that you get visceral, organic performances."
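Gilman's "musical" timing rules can be sketched as a phase calculation per combatant. The beat counts for orcs (five) and elves (three) come from the article; the dwarf beat count, the tempo, and all function and variable names are my own assumptions for illustration, not Weta's crowd tools.

```python
# Hypothetical sketch of per-race march timing (illustrative only).
# Orc and elf beat counts are from the article; the dwarf count is assumed.
BEATS_PER_CYCLE = {"orc": 5, "elf": 3, "dwarf": 4}

def march_phase(race: str, rank: int, t: float, bpm: float = 120.0) -> float:
    """Phase in [0, 1) of a combatant's stride cycle at time t (seconds).

    Dwarves in alternating ranks are offset half a cycle, modeling the
    off-step march that produces oscillation between ranks' shoulders.
    """
    beats = BEATS_PER_CYCLE[race]
    cycle_seconds = beats * 60.0 / bpm      # duration of one full cycle
    phase = (t / cycle_seconds) % 1.0
    if race == "dwarf" and rank % 2 == 1:
        phase = (phase + 0.5) % 1.0         # every other rank is off-step
    return phase

# Adjacent dwarf ranks stay exactly half a cycle apart at any instant.
print(march_phase("dwarf", 0, 0.0))  # -> 0.0
print(march_phase("dwarf", 1, 0.0))  # -> 0.5
```

Because orcs, elves, and dwarves cycle on different beat counts, identical footage of each race would still read with distinct rhythm, which is the "monotone timing is boring" problem Gilman's rules are designed to avoid.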
