Post Magazine

March 2017

Issue link: https://digital.copcomm.com/i/799350

(Top) Actor Quartley wearing a head-mounted camera and mocap suit. (Bottom) One of many CG versions of Ariel created by the Imaginarium for The Tempest.

able, those are closed systems. The Imaginarium's Grip platform — which calibrates, tracks and retargets facial movement — works in real time and is scalable, making it ideal for this situation.

The setup entails a head-mounted camera (HMC), which in this case is tethered, meaning the video stream is sent through a cable at the back rather than through wireless transmission. The HMC also illuminates Quartley's face, as the capture environment is fairly dark and the group wanted the audience to be able to understand what they're seeing, so IR wasn't considered, Lumsden notes.

"We ingest the video data at 720p50 and feed it into our realtime facial tracker," he says. "A nice feature here is that we don't need to apply any special makeup for this step (as Quartley is wearing his Ariel stage makeup)."

As Lumsden explains, the tracker had been trained on images of Quartley captured in several sessions over the last year, so it's well tuned to variations in the image that occur between shows. The Imaginarium annotated those training images with the positions of the actor's facial features, such as specific points on the lips, jaw, nostrils, eyebrows, eyelids and pupils.

The crew compensates for small variations in the point of view of the helmet by identifying stabilizing points at the temples, on the bridge of the nose, and where the nose meets the lip — areas that move only slightly. The rest of the tracked points are interpreted as animation of the face. Those animations are retargeted onto the Harpy rig controls and streamed to the rendering engine (UE4), where they are combined with the body controls, rendered and then projected. "We wrote a blueprint in UE4 that receives the animation data over the network and writes it onto the rig," Lumsden says.
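The stabilization step Lumsden describes (fitting the near-rigid anchor points at the temples, nose bridge and nose/lip junction to a reference pose, so that only genuine facial motion remains) can be sketched as a least-squares similarity alignment run on every frame. This is a hypothetical Python/NumPy illustration, not the Imaginarium's Grip code; every function name and index here is invented:

```python
import numpy as np

def fit_similarity(src, dst):
    # Least-squares similarity (scale * rotation + translation) mapping
    # src onto dst; both are (N, 2) arrays of 2D tracked points.
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    s, d = src - mu_s, dst - mu_d
    U, S, Vt = np.linalg.svd(d.T @ s)
    sign = np.ones(2)
    if np.linalg.det(U @ Vt) < 0:      # guard against reflections
        sign[-1] = -1.0
    R = U @ np.diag(sign) @ Vt
    scale = (S * sign).sum() / (s ** 2).sum()
    t = mu_d - scale * R @ mu_s
    return scale, R, t

def stabilize(frame_pts, anchor_idx, ref_anchors):
    # Align this frame so its stabilizing points (temples, bridge of
    # the nose, nose/lip junction) land on the reference pose; the
    # residual motion of the remaining points is facial animation.
    scale, R, t = fit_similarity(frame_pts[anchor_idx], ref_anchors)
    return scale * frame_pts @ R.T + t
```

With four or more non-collinear anchors, the helmet's small per-frame pose drift is factored out before the remaining point motion is handed to the retargeter.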
The retargeting is learned from hand-tuned expressions that are mapped to facial expressions Quartley provided in a facial range-of-motion (FACS ROM) training set. "From there, we can make corrections to keyframes to improve the result," says Lumsden.

As Lumsden points out, the group does not use timecode synchronization for this setup. All of the systems — face, body, render, projection — run in realtime at approximately 50 frames per second. "Of course, there's latency, and more than I'd like, but the objective here is to be as close to live as possible," he says. "Normally I'd like to run the face at a higher frame rate, but we're limited by the choice of hardware suited to the HMC form factor."

The face is primarily a blendshape rig, making it easier to be certain of uniformity across the various platforms where it exists (Maya, Unreal, Grip). Going forward, the Imaginarium is looking at Fabric Software's Fabric Engine as a potential solution to that.

"The animation isn't going to be comparable to a Hollywood blockbuster, where it's been perfected by animators in post production," says Lumsden. "It's enough for now to be opening the door of this new world of realtime performance capture, and I'm thrilled that Intel and the RSC have been bold enough to want this, and very proud of the work that our team at the Imaginarium has put together to deliver it."

MOVING FORWARD

According to Lumsden, a mandate at the Imaginarium from Serkis — who comes from a theater background — is to make performance capture as prevalent as possible within different forms of media. "He loves the idea of pushing the boundaries," he adds.

Theater is now entering an era where characters and scenes can be presented in ways that are more visually engaging and, in many cases, far beyond what even the authors originally imagined. "Inertial motion capture is changing how far productions can push their craft, bringing high-end digital characters into live shows," says Beute.
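A blendshape face like the one Lumsden describes is, at its core, a neutral mesh plus a weighted sum of per-shape vertex offsets, which is why keeping the rig consistent across Maya, Unreal and Grip largely comes down to agreeing on shape names and weight ranges. A minimal sketch, with hypothetical shape names and an assumed [0, 1] weight convention:

```python
import numpy as np

def evaluate_blendshapes(neutral, deltas, weights):
    # neutral: (V, 3) rest-pose vertex positions.
    # deltas:  dict of shape name -> (V, 3) offsets from neutral.
    # weights: dict of shape name -> animation weight, clamped to [0, 1].
    mesh = neutral.copy()
    for name, w in weights.items():
        delta = deltas.get(name)   # unknown shape names are skipped
        if delta is not None:
            mesh += np.clip(w, 0.0, 1.0) * delta
    return mesh

# A tracker/retargeter would stream one weights dict per frame, e.g.:
neutral = np.zeros((4, 3))
deltas = {"jawOpen": np.tile([0.0, -1.0, 0.0], (4, 1))}
frame = evaluate_blendshapes(neutral, deltas, {"jawOpen": 0.5})
```

Because each frame is just this weighted sum, the same ~50fps weight stream can drive the rig in every package that carries the same shape set.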
"With The Tempest, the RSC is creating a real-time application that is both immediate and novel, something audiences always want to see on their nights out."

Performance capture has been a game changer in film, but doing a live performance every night in realtime on a theater stage presented a unique challenge, one that has been met with resounding success. "In films, we are used to highly realistic visual effects. The ability to create visual effects in real time brings a lot of freedom to any application. Being up close to the stage and performing live are big challenges for bringing these visual effects to the theater, and until now, they have relied on visual tricks, like onstage magicians do," says Beute. "Adding our technology brings live, realistic visual effects a step closer to the theater."

Imogen Moorhouse, CEO of Vicon, adds: "Combining one of Shakespeare's most renowned plays with innovative, augmented reality driven by Vicon [and others'] technology signals a change in theater production and traditional media. It's important to continue to innovate in motion capture, and the Royal Shakespeare Company has demonstrated just how versatile the technology is when used to ignite the imagination of theater-goers old and new."

It's not just motion capture and facial capture that are opening new doors. Advances in realtime technology are making an impact across entertainment, too. "There is some crazy stuff going on in realtime," says Lumsden, as he points to Epic Unreal Engine's Sequencer Editor, a multi-track editor for creating and previewing cinematic sequences in realtime, as a technology to watch.
Many inside and outside of the gaming world are also closely watching the progress of Ninja Theory's work on the upcoming video game Hellblade: Senua's Sacrifice, which uses realtime cinematography to capture the nuances of the digital character's expressions, the lighting and visual effects, then renders them as 3D data directly within UE4's Sequencer.

No matter if it's film, gaming, theater or another genre, there will always be those who think outside the box, even for a production that is centuries old and has been performed countless times. Two years ago, when the RSC was searching for a technology that would not only dazzle audiences but also enhance the theater experience, the group asked themselves what Shakespeare would be using if he were alive today. The answer, they agreed, was that he would be looking at all the cutting-edge tools and technology available. And that is exactly what this group decided to do, and as a result, gave the performance of a lifetime.
