Computer Graphics World

January / February 2017

Issue link: https://digital.copcomm.com/i/786150


… stage, some of which are held by the actors. So, precision is vital. The data is processed by the Vicon Tracker software, which runs on Intel Xeon and Core i7 processors.

Without question, the setup – designed and devised by the crew at the RSC – is extremely complex. The group did, in fact, consider having Quartley perform offstage in a capture volume, controlling the avatar projected onstage.

"It would have been technically easier, and we would have used an optical system [as opposed to the inertial system], which gives you more geographic precision," explains Lumsden. "But as Greg [Doran] asked, how do you get that intimate relationship between the two characters, the master and the spirit, Prospero and Ariel? They have to look at each other, and the relationship has to be true and authentic every night. That's hard to do when one of the actors is offstage."

Another alternative would have been to use pre-canned animation, but that would not have done justice to the play, either. "[The alternatives] would have been less challenging from a purely motion-capture standpoint," says Lumsden. "But the creative messaging would not have been as strong."

MOTION TECHNOLOGY

Xsens' inertial motion-capture systems are especially well suited for live performances, since there is no need for cameras and markers, or a preconditioned area. The data can then be streamed in real time into Epic's Unreal Engine 4, as it was in this case.

"It gave the RSC the chance to be creative with Ariel's costume by hiding the technology and focusing on its visual appearance," says Beute. "It also gave Mark as Ariel the flexibility to move around the stage without technical limitations."

As Beute points out, motion capture is typically a one-off shoot: You plan a shoot, capture the movements, and you are done. Taking motion capture to the theater means the technology needed to be robust and reliable enough to sustain several plays, or shoots, a week. It's worth noting that despite the cutting-edge nature of this application, the RSC used an off-the-shelf Xsens system with the latest MVN Studio software.

While Lumsden cites the overall project as the biggest challenge, he is quick to point out that the real-time facial-capture technology for this application proved especially difficult. "We created our own real-time setup for the show, and it was really challenging to get that working. It's proprietary software we wrote, but now it will be part of future projects we do at the Imaginarium," he says.

While there are some commercial options available, those are closed systems. The Imaginarium's Grip platform – which calibrates, tracks, and retargets facial movement – works in real time and is scalable, making it ideal for this situation.

The setup entails a head-mounted camera (HMC), which in this case is tethered, meaning the video stream is sent through a cable at the back rather than over a wireless link. The HMC also illuminates Quartley's face: the capture environment is fairly dark, and because the group wanted the audience to be able to understand what they're seeing, IR illumination wasn't considered, Lumsden notes.

"We ingest the video data at 720p50 and feed it into our real-time facial tracker," he says. "A nice feature here is that we don't need to apply any special makeup for this step (as Quartley is wearing his Ariel stage makeup)."
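Grip itself is proprietary, but the ingest step Lumsden describes – pulling 720p video at 50 fps from the tethered HMC and handing each frame to a facial tracker – can be sketched generically. In the sketch below, the capture device index and the track_landmarks() stub are hypothetical stand-ins, not the Imaginarium's actual code.

    import cv2  # OpenCV, assumed here only for video capture

    def track_landmarks(gray_frame):
        # Stand-in for a trained facial-feature tracker such as Grip.
        # A real tracker would return named 2D points for the lips, jaw,
        # nostrils, eyebrows, eyelids, and pupils.
        return {}

    def ingest_hmc(device=0):
        cap = cv2.VideoCapture(device)            # tethered HMC video feed
        cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1280)   # 720p resolution...
        cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 720)
        cap.set(cv2.CAP_PROP_FPS, 50)             # ...at 50 frames per second
        try:
            while True:
                ok, frame = cap.read()
                if not ok:
                    break
                gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
                yield track_landmarks(gray)       # one set of tracked points per frame
        finally:
            cap.release()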
As Lumsden explains, the tracker had been trained on images of Quartley captured in several sessions over the last year, so it is well tuned to the variations in the image that occur between shows. The Imaginarium annotated those training images with the positions of the actor's facial features, such as specific points on the lips, jaw, nostrils, eyebrows, eyelids, and pupils.

The crew compensates for small variations in the point of view of the helmet by identifying stabilizing points at the temples, on the bridge of the nose, and where the nose meets the lip – areas that move only slightly. The rest of the tracked points are interpreted as animation of the face. Those animations are retargeted onto the Harpy rig controls and streamed to the rendering engine (UE4), where they are combined with the body controls, rendered, and then projected.

[Images: (Top) Actor Quartley wearing a head-mounted camera and mocap suit. (Bottom) One of many CG versions of Ariel created by the Imaginarium for "The Tempest."]

"We wrote a blueprint in UE4 that receives the animation data over the network and writes it onto the rig," Lumsden says.
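The article does not describe the wire format between the facial pipeline and the UE4 blueprint, but the general pattern – pushing a dictionary of rig-control values across the network once per frame for a receiver to write onto the character rig – can be sketched as follows. The host, port, JSON packet layout, and control names here are illustrative assumptions, not the RSC/Imaginarium protocol.

    import json
    import socket
    import time

    RENDER_HOST = ("127.0.0.1", 9100)   # hypothetical address of the UE4 machine

    def stream_controls(frames, fps=50):
        # frames: iterable of dicts mapping rig-control names to float values,
        # e.g. {"jaw_open": 0.42, "brow_raise_L": 0.10}. A listener on the
        # render host (the blueprint, in the article's setup) would apply
        # these values to the rig each frame.
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        for i, controls in enumerate(frames):
            packet = json.dumps({"frame": i, "controls": controls}).encode("utf-8")
            sock.sendto(packet, RENDER_HOST)
            time.sleep(1.0 / fps)       # hold to the 50 fps capture rate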
