Computer Graphics World

JANUARY/FEBRUARY 2011

Issue link: https://digital.copcomm.com/i/26447

Photo caption: Mark Sagar, at left, received a Scientific and Engineering Award for his early and continuing work on retargeting motion data onto digital characters. A system he developed with a team at Weta Digital helped Avatar director Jim Cameron see retargeted expressions on Na’vi in real time.

…premiered at SIGGRAPH in 1999. And, he had collaborated with Debevec to light his faces, an alliance that resulted in the Light Stage technology. "Sony Pictures Imageworks hired me to apply the Light Stage imaging and rendering system to Doc Ock in Spider-Man 2," Sagar says. "But while I was there, I looked at what they were doing with motion capture and proposed using a FACS system instead." And that, together with his later work at Weta Digital, would eventually result in this year’s SciTech award.

Sagar had first seen research on using the FACS coding system with data from a tracked face while at MIT in the mid-’90s. "The methods weren’t applicable to film production, but I kept it in the back of my mind," he says. And in mid-2002, he built a system that automatically calculated FACS values for expressions from motion-captured data by analyzing the motion-capture points. He also created a system for calibrating actors: a set of expressions based on the FACS coding system, plus ones he added for dialog.

"In the early 2000s, motion capture was mainly done by using the points to deform a patch of skin directly," Sagar says. "The problem is that it introduced noise, it was difficult to see what was going on, and it was especially difficult to edit. If you have 100 points moving around on a face and you want to change an expression, there’s no way to do that because the points are tied to the skin. For example, if you want to open the jaw, because the points are tied to the skin, there’s no simple way to control that."

At LifeFX, Sagar and his team had created photorealistic faces by capturing dense motion data they retargeted to digital characters. "It was easy to move the data onto a digital double," he says, "but when you try to map the data onto a face that’s fundamentally different, the problems start coming. You can warp the data, which is what people were doing for some solutions. But, say you have an eyebrow. If you move it up on a human, the skin moves in a straight line. On a gorilla, it moves around the bone in a circular motion. There’s no easy way to change that linear motion to a circular motion mathematically and get the scale right. So I wondered, What’s the actual information? What’s the fundamental information that the face gives us? How can we break this down into something that’s universal? And that’s when I started experimenting with FACS. FACS is really an alphabet of facial expressions. It’s the simplest way to represent what’s going on in the face."
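The decomposition Sagar describes can be pictured as a small fitting problem: each calibrated expression contributes one column of marker displacements, and every captured frame is then explained as a non-negative blend of those columns. The sketch below is purely illustrative, not a description of Sagar’s actual system; the marker counts, the non-negative least-squares solver, and all of the function names are assumptions.

```python
# Illustrative sketch only: decompose one frame of captured facial marker
# motion into FACS-style expression weights, given a calibration session in
# which the actor performed each expression in isolation.
import numpy as np
from scipy.optimize import nnls  # non-negative least squares

def build_basis(neutral_markers, calibration_poses):
    """Each calibration pose becomes one column of displacements from the
    neutral face, giving a basis of shape (3*M markers, K expressions)."""
    neutral = neutral_markers.reshape(-1)
    columns = [pose.reshape(-1) - neutral for pose in calibration_poses]
    return np.stack(columns, axis=1)

def solve_facs_weights(frame_markers, neutral_markers, basis):
    """Express a captured frame as non-negative activations of the basis."""
    delta = frame_markers.reshape(-1) - neutral_markers.reshape(-1)
    weights, _residual = nnls(basis, delta)  # weights >= 0, one per expression
    return weights

if __name__ == "__main__":
    # Hypothetical numbers: 100 markers, 40 calibrated expressions.
    rng = np.random.default_rng(0)
    neutral = rng.normal(size=(100, 3))
    calibration = [neutral + 0.01 * rng.normal(size=(100, 3)) for _ in range(40)]
    basis = build_basis(neutral, calibration)
    # A frame mixing two calibrated expressions at 60% and 30%.
    frame = (neutral + 0.6 * (calibration[3] - neutral)
                     + 0.3 * (calibration[17] - neutral))
    print(solve_facs_weights(frame, neutral, basis)[[3, 17]])  # ~[0.6, 0.3]
```

Editing then happens on a few dozen recovered weights rather than on 100 points tied to the skin, which is the property Sagar points to.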
At Sony Pictures Imageworks, Sagar convinced Damian Gordon, the motion-capture supervisor, to let him do a test for Monster House. "The test was successful," he says. "They put it into production on Monster House." Then a year later, the New Zealand native needed to go home for family reasons. As luck would have it, though, Weta Digital had just started working on King Kong. "They were going to hand-animate King Kong," he says. "But Joe Letteri [senior visual effects supervisor] was open to new ideas, and I got together with Andy Serkis, the actor who played King Kong. We had to do about 40,000 frames of King Kong animation to convince Peter Jackson, but we changed the way King Kong was going to be animated."

The facial animation system Sagar has developed is flexible: It can work with data captured from any motion-capture system (points, Mova data, video images, and so forth). It calculates the FACS expressions and maps them onto a digital character using blendshapes, muscle models, joints, whatever. "I have a way to map the FACS data onto whatever animation controls the animators use," he says. "I had to come up with special ways of representing the information in order to calculate it, certain math tricks. But, it computes the data for them. And the good thing is that it allows animation and motion capture to be mixed, so animators have the best of both worlds."

Once the idea of using FACS data entered the atmosphere, other studios began following suit. Meanwhile, for Avatar, Sagar and a crew at Weta Digital created a real-time system. "James Cameron could look through a virtual camera and see the Na’vi expressing and looking around live. We mixed that together with the body motion capture so you could see the characters performing in the environment."

Because Cameron wanted the actors to wear helmet cameras rather than install cameras around the stage, Sagar used real-time computer vision techniques to track the face in the 2D images and compute the FACS expressions. "The good thing is that the face is a constrained system, so it works, even though the points are moving on a plane," he says. "The system recognizes that someone is pursing their lips, so on the 3D model, it pushes the lips forward."
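As with the earlier sketch, the following is an illustration of the idea rather than the Weta implementation: the same non-negative fit, but run on image-plane displacements from a single head-mounted camera, with the recovered weights then applied to a character’s own 3D blendshape deltas. The orthographic camera stand-in and every name below are assumptions.

```python
# Illustrative sketch only: fit FACS-style weights from 2D tracked features
# seen by a helmet camera, then drive a (possibly very different) character's
# 3D blendshapes with the same weights.
import numpy as np
from scipy.optimize import nnls

def project_2d(points_3d, scale=1.0):
    """Orthographic stand-in for the helmet camera: drop depth, apply scale."""
    return scale * points_3d[:, :2]

def solve_weights_from_2d(tracked_2d, neutral_3d, expr_deltas_3d, scale=1.0):
    """Fit non-negative weights so the projected expression basis explains
    the 2D motion of the tracked facial features."""
    neutral_2d = project_2d(neutral_3d, scale).reshape(-1)
    basis_2d = np.stack(
        [project_2d(neutral_3d + d, scale).reshape(-1) - neutral_2d
         for d in expr_deltas_3d],
        axis=1)                                   # (2*M features, K expressions)
    observed = tracked_2d.reshape(-1) - neutral_2d
    weights, _residual = nnls(basis_2d, observed)
    return weights

def apply_to_character(weights, char_neutral, char_expr_deltas):
    """Retarget: the weights, not the points, cross over to the character,
    so lip pursing detected on a flat image still pushes the 3D lips forward."""
    mesh = char_neutral.copy()
    for w, delta in zip(weights, char_expr_deltas):
        mesh += w * delta
    return mesh
```

Because only the weights cross from the footage to the character, the face being "a constrained system" is what lets a flat, 2D observation recover a plausible 3D performance.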

Now, Sagar is moving deeper. He is working with a team on a full biomechanical simulation of the face. "I want to automate how a face is built in a physically realistic way," he says. To do this, Sagar is collaborating with researchers at the Auckland Bioengineering Institute, and reading. "I’ve read 100 plastic surgery journals," he says. "I’ve been to dissections. I’ve had my head MRI’d."

And, that means he’s now come full circle. "My background is in bioengineering," Sagar says. "It’s fun to return to exploring the physical basis for all this."

Global Illumination

Technical Achievement Award (Academy Certificate) to Eric Tabellion and Arnauld Lamorlette for the creation of a computer graphics bounce-lighting methodology that is practical at feature-film scale.

Arnauld Lamorlette and Eric Tabellion proved they could use global illumination to create photorealistic images in test shots for The Matrix, but PDI/DreamWorks, which then was still doing postproduction visual effects work, did not end up working on that…
