Computer Graphics World

January / February 2017

Issue link: https://digital.copcomm.com/i/786150


The retargeting is learned from hand-tuned expressions that are mapped to facial expressions that Quartley provided in a facial range-of-motion (FACS ROM) training set. "From there, we can make corrections to keyframes to improve the result," says Lumsden.

As Lumsden points out, the group does not use timecode synchronization for this setup. All of the systems – face, body, render, projection – run in real time, at approximately 50 frames per second. "Of course, there's latency, and more than I'd like, but the objective here is to be as close to live as possible," he says. "Normally I'd like to run the face at a higher frame rate, but we're limited by the choice of hardware suited to the HMC form factor."

The face is primarily a blendshape rig, making it easier to be certain of uniformity across the various platforms where it exists (Maya, Unreal, GRIP). Going forward, the Imaginarium is looking at Fabric Software's Fabric Engine as a potential solution to that cross-platform consistency challenge. (A minimal sketch of how a learned retargeting step like this can drive a blendshape rig appears below.)

"The animation isn't going to be comparable to a Hollywood blockbuster, where it's been perfected by animators in postproduction," says Lumsden. "It's enough for now to be opening the door of this new world of real-time performance capture, and I'm thrilled that Intel and the RSC have been bold enough to want this, and very proud of the work that our team at the Imaginarium has put together to deliver it."

MOVING FORWARD

According to Lumsden, a mandate at the Imaginarium from Serkis – who comes from a theater background – is to make performance capture as prevalent as possible within different forms of media. "He loves the idea of pushing the boundaries," he adds.

Theater is now entering an era where characters and scenes can be presented in ways that are more visually engaging and, in many cases, far beyond what even the authors originally imagined.

"Inertial motion capture is changing how far productions can push their craft, bringing high-end digital characters into live shows," says Beute. "With 'The Tempest,' the RSC is creating a real-time application that is both immediate and novel, something audiences always want to see on their nights out."

Indeed, performance capture has been a game changer in film, but doing a live performance every night in real time on a theater stage presented a unique challenge, one that has been met with resounding success.

"In films, we are used to highly realistic visual effects. The ability to create visual effects in real time brings a lot of freedom to any application. Being up close to the stage and performing live are big challenges for bringing these visual effects to the theater, and until now, they have relied on visual tricks, like onstage magicians do," says Beute. "Adding our technology brings live, realistic visual effects a step closer to the theater."

Imogen Moorhouse, CEO of Vicon, adds: "Combining one of Shakespeare's most renowned plays with innovative, augmented reality driven by Vicon [and others'] technology signals a change in theater production and traditional media. It's important to continue to innovate in motion capture, and the Royal Shakespeare Company has demonstrated just how versatile the technology is when used to ignite the imagination of theater-goers old and new."

Yet it's not just motion capture and facial capture that are opening new doors. Advances in real-time technology are making an impact across entertainment as well.
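To make the retargeting step Lumsden describes more concrete, here is a minimal sketch of the general technique: fit a mapping from tracked facial features to hand-tuned blendshape weights using a FACS ROM training set, then apply that mapping every frame to drive a blendshape rig. The plain linear least-squares solve, the function names, and the array shapes are illustrative assumptions, not details of the Imaginarium's actual pipeline.

```python
# Minimal sketch of learned facial retargeting onto a blendshape rig.
# All names, dimensions, and the linear least-squares fit are
# illustrative assumptions, not the production system described here.
import numpy as np

# --- Training (offline, from the FACS ROM session) -----------------
def fit_retargeting(F: np.ndarray, W: np.ndarray) -> np.ndarray:
    """Fit a linear map M such that F @ M ~= W.

    F: (frames, features)  tracked feature vectors from the ROM take
    W: (frames, shapes)    hand-tuned blendshape weights for the
                           same frames, authored by an animator
    """
    M, *_ = np.linalg.lstsq(F, W, rcond=None)
    return M

# --- Runtime (per frame, e.g. ~50 fps) -----------------------------
def retarget(features: np.ndarray, M: np.ndarray) -> np.ndarray:
    """Map one frame of tracked features to blendshape weights."""
    w = features @ M
    return np.clip(w, 0.0, 1.0)   # keep weights in a valid range

def evaluate_blendshapes(neutral: np.ndarray,
                         deltas: np.ndarray,
                         weights: np.ndarray) -> np.ndarray:
    """Core of a blendshape rig: the neutral mesh plus a weighted
    sum of per-shape vertex offsets.

    neutral: (verts, 3); deltas: (shapes, verts, 3); weights: (shapes,)
    """
    return neutral + np.tensordot(weights, deltas, axes=1)

if __name__ == "__main__":
    # Stand-in random data, just to show the shapes involved.
    rng = np.random.default_rng(0)
    N, D, S, V = 200, 40, 26, 1000   # frames, features, shapes, verts
    F = rng.normal(size=(N, D))
    W = rng.uniform(size=(N, S))
    M = fit_retargeting(F, W)
    w = retarget(rng.normal(size=D), M)
    mesh = evaluate_blendshapes(rng.normal(size=(V, 3)),
                                rng.normal(size=(S, V, 3)) * 0.01, w)
    print(mesh.shape)  # (1000, 3)
```

A production setup would layer the keyframe corrections Lumsden mentions on top of the solved weights, and would typically use a richer example-based or nonlinear mapping than this single linear fit.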
"There is some crazy stuff going on in real time," says Lumsden, as he points to Epic Unreal Engine's Sequencer Editor, a multi-track editor for creating and preview- ing cinematic sequences in real time, as a particular technology to watch. In addition, many inside and outside of the gaming world are closely watching the progress of Ninja Theory's work on the upcoming video game Hellblade: Senua's Sacrifice, which utilizes real-time cinematography to cap- ture the nuances of the digital character's facial expressions, the lighting, and visual effects, then renders it as 3D data directly within UE4's Sequencr. No matter if it's film, gaming, theater, or other genres, there will always be those who think outside the box, even for a production that is centuries old and has been per- formed countless times. Two years ago when the RSC was searching for a technology that would not only dazzle audiences but also enhance the theater experience, the group had asked themselves what Shakespeare would be using if he were alive today. The answer, they agreed, was that he would be looking at all the cutting-edge tools and technol- ogy available. And that is exactly what this group decided to do, and as a result, gave the performance of a lifetime. ¡ REAL-TIME FACIAL AND BODY CAPTURE PERFORMED ON STAGE BY ACTOR QUARTLEY DROVE THE ANIMATION OF THIS 3D HARPY. Karen Moltenbrey is the chief editor of Computer Graphics World.
