Post Magazine

July 2017


"It was decided early on that it was going to be a cloaked plane," says Bialek of the design. "The plane has a cloaking device that makes it invisible. [That] in itself has a lot of intrinsic challenges. How do you make it interesting, where the main object that the characters are fighting on is supposed to be invisible? How do you make that compelling?

"They didn't have a buy-off on the design and how the lighting would work on this cloaked plane. Well, how do we shoot it? Do we shoot it practical and put a bunch of practical LED lights under the character? We didn't want to be locked into something. We knew shooting it practically was going to be a real challenge, so we opted to do it all-CG."

In addition to the characters and the plane itself, the scene also incorporates CG clouds and sky. The ultimate crash on the beach is a combination of elements.

"Once they crash on the beach, you are dealing with more of a hybrid situation," he explains. The scene was shot on a sound stage in Georgia, on what Bialek describes as a "big sandbox," with gas lines, practical fire and plane wreckage strewn about. "We [created] set extensions on top of that and layered on a ton of effects in between them to add some connective tissue between what is CG and what is live action."

— By Marc Loftus

Tom Holland plays Spider-Man. His costume evolves throughout the film.

MPC'S FACE TRACKING IN THE MUMMY

LOS ANGELES — Moving Picture Company (MPC) (moving-picture.com/film) will present a talk at this year's SIGGRAPH show in Los Angeles on "Optical Flow-based Face Tracking in The Mummy."

According to Butler, "What we are presenting at SIGGRAPH is a bit of software work we did to use Nuke's smart vector tracking capability as part of our facial tracking system. On previous shows, an artist would have to sit and use a facial rig to track facial movements in a very painstaking way. On this show, instead, we did what we call 'Skull Track,' where we track the movement of the skull itself, and then most of the time we're able to have Nuke do the heavy lifting of the facial movements to track digital applications. We're able to get Nuke to give us a 3D version of a face to go straight into the render pipeline. It was 2D-based tracking technology, but we were able to convert it to 3D and make it part of our pipeline, and that saved us so much work. It was the first show where we had really moved away from manual methods and avoided having to go into true facial motion capture. We found a good hybrid solution that had enough tracking accuracy just working off of the film plates themselves."

MPC's Curtis Andrus will present the topic to show attendees on Thursday, August 3rd, at 10:45 am in Room 402AB of the Los Angeles Convention Center.

MPC employed Nuke in its facial-tracking pipeline. Final results were achieved using a hybrid approach.
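The workflow Butler describes leans on Nuke's Smart Vector toolset, which computes dense motion vectors for a plate once and then reuses them to carry a frame-held element through the shot. As a rough illustration only, the sketch below drives that toolset from Nuke's Python API; SmartVector and VectorDistort are standard NukeX node classes, but the file paths, frame numbers, knob values and overall wiring here are assumptions for illustration, not MPC's actual "Skull Track" pipeline.

```python
# Illustrative sketch: driving NukeX's Smart Vector toolset from Nuke's Python API.
# SmartVector and VectorDistort are standard NukeX node classes; the paths, frame
# numbers and knob names flagged below are assumptions, not a production setup.
import nuke

# The scanned film plate that the motion vectors will be computed from.
plate = nuke.createNode("Read")
plate["file"].setValue("/shots/plate.####.exr")  # placeholder path

# SmartVector analyzes the plate and renders dense per-pixel motion vectors to disk,
# so the (slow) analysis only has to happen once per shot.
vectors = nuke.createNode("SmartVector")
vectors.setInput(0, plate)
vectors["file"].setValue("/shots/plate_smartvec.####.exr")  # placeholder path

# An element locked to a single reference frame, e.g. facial markup painted on
# one frame of the plate.
markup = nuke.createNode("Read")
markup["file"].setValue("/shots/face_markup.1001.exr")  # placeholder path

# VectorDistort warps the frame-held element through time using the precomputed
# vectors, so it "sticks" to the moving face without frame-by-frame hand tracking.
warp = nuke.createNode("VectorDistort")
warp.setInput(0, markup)
warp.setInput(1, vectors)
warp["reference_frame"].setValue(1001)  # knob name assumed from the standard toolset
```

In broad strokes, this is only the 2D piece of the hybrid approach described in the quote; projecting those tracks onto a skull-stabilized 3D face for the render pipeline, as MPC did, goes beyond what the toolset provides out of the box.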
