Computer Graphics World

May / June 2017

VIEWPOINT

REAL TIME, A VIRTUAL REALITY
The tech, the magic, and the pressure of bringing virtual characters to life
BY HEIN BEUTE

As I sat in the audience during the live Hellblade demo by Ninja Theory at GDC 2016, I was excited, even though I knew what was coming. Real-time performances are special. There is a unique kind of energy generated on stage, and as a spectator, you become part of the spectacle.

Real-time VFX performances are among the most exciting uses of the medium, but they also come with extra pressure. I felt a bit of that pressure during the GDC stage show, knowing the effort that had been put into this demo by all the different parties, including Epic, with its Unreal Engine 4, and Cubic Motion, IKinema, and Xsens, with their real-time mocap technologies. It was a beautiful demonstration of what is possible using real-time motion capture in a game engine for a cinematic production.

The real-time component, enabled by game engines, empowers animators to reinvent their pipeline. Actors can control their characters in real time, in any scene. It creates new options and significantly reduces production time.

The demonstration created a stir in the community, with reactions like, "Hellblade takes real-time motion capture to the next level" and "The demo takes a big step forward in crossing the 'Uncanny Valley.'" It was indeed an amazing presentation. Epic and Ninja Theory have done an incredible job of blurring the boundaries between film and gaming production technology, demonstrating that a scene can be shot, captured, edited, and rendered to film production quality in minutes instead of days or weeks. A few years ago, this was unheard of; now we are in danger of taking it for granted.

Access to real-time motion-capture technology not only saves time and money, but it also preserves the integrity of the actor's performance. It becomes much easier to capture the subtle nuances of timing, emotion, and improvisation.

Faithfully re-creating an actor's performance is also crucial for an immersive experience, as in VR games. Playing VR games is all about the user believing that he or she is present in the virtual environment, and being immersed only through sight and a pair of floating hands – a common occurrence in many VR projects – does not help this cause. Full-body immersion, where you can actually see your whole body and the body of your opponent, offers a richer experience that will keep players immersed longer. An inertial mocap system, like Xsens' MVN, allows users to comfortably wear inertial sensors that re-create their movements and send the data wirelessly. That truly puts them "in the game."

TECHNOLOGY

We are frequently asked how difficult it is, and how much more technological knowledge an artist needs, to create a realistic character. Is it a matter of plugging together commercial off-the-shelf hardware and software? Can anyone set up a live production for a concert, theme park, TV show, or theater? Or do you need a large team and many months of preparation?

A believable character like Senua in the Hellblade demo is not created in a week or two. It takes expertise across several different facets to ensure a believable virtual performance. To create a believable digital character, the VFX industry focuses on getting the details right, beginning with skinning and rigging. Once you have a character, it then needs to be brought to life with rendering techniques.
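The skinning step mentioned above has a standard mathematical core: linear blend skinning, where each vertex is deformed by a weighted blend of its influencing joints' transforms. The sketch below is a minimal, illustrative implementation in Python with NumPy; the joint transforms, weights, and the skin_vertices helper are hypothetical examples, not taken from the Hellblade pipeline or any particular tool.

```python
# Minimal linear blend skinning (LBS) sketch. All data here is an
# illustrative placeholder, not from any particular rig or pipeline.
import numpy as np

def skin_vertices(rest_positions, joint_matrices, weights):
    """Deform rest-pose vertices with linear blend skinning.

    rest_positions : (V, 3) rest-pose vertex positions
    joint_matrices : (J, 4, 4) skinning matrices, i.e. each joint's
                     current transform times the inverse of its
                     rest-pose (bind) transform
    weights        : (V, J) per-vertex joint weights, rows summing to 1
    """
    V = rest_positions.shape[0]
    # Homogeneous coordinates so 4x4 matrices can translate as well as rotate.
    homo = np.hstack([rest_positions, np.ones((V, 1))])          # (V, 4)
    # Transform every vertex by every joint, then blend by weight:
    # v' = sum_j w_j * (M_j @ v)
    per_joint = np.einsum('jab,vb->vja', joint_matrices, homo)   # (V, J, 4)
    blended = np.einsum('vj,vja->va', weights, per_joint)        # (V, 4)
    return blended[:, :3]

# Toy example: two joints, one vertex influenced 50/50.
rest = np.array([[1.0, 0.0, 0.0]])
identity = np.eye(4)
lift = np.eye(4); lift[1, 3] = 2.0     # joint 1 translates +2 in Y
mats = np.stack([identity, lift])
w = np.array([[0.5, 0.5]])
print(skin_vertices(rest, mats, w))    # -> [[1. 1. 0.]]
```

Production rigs layer corrective shapes and more sophisticated deformers on top of this, precisely because plain LBS alone tends to collapse volume at bent joints.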

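Earlier, the piece describes inertial sensors re-creating an actor's movements and sending the data wirelessly. On the receiving end, that boils down to a loop that reads pose packets off the network and hands them to the engine every frame. The Python sketch below shows only that shape; the packet layout, port, segment count, and apply_to_skeleton helper are invented placeholders, not the actual Xsens MVN streaming protocol.

```python
# Sketch of the receiving end of a wireless mocap stream: read pose packets
# from a UDP socket and hand them to the engine each frame. The packet layout
# (frame id + 23 segments of position + quaternion floats) is a hypothetical
# stand-in, NOT the real Xsens MVN network format.
import socket
import struct

NUM_SEGMENTS = 23        # full-body suits commonly track around 23 segments
FLOATS_PER_SEGMENT = 7   # x, y, z position + w, x, y, z orientation quaternion
PACKET_FMT = '<I' + 'f' * (NUM_SEGMENTS * FLOATS_PER_SEGMENT)
PACKET_SIZE = struct.calcsize(PACKET_FMT)

def receive_poses(host='0.0.0.0', port=9763):   # arbitrary example port
    """Yield (frame_id, segments) tuples as pose packets arrive."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    while True:
        data, _addr = sock.recvfrom(PACKET_SIZE)
        if len(data) != PACKET_SIZE:
            continue                             # drop malformed packets
        values = struct.unpack(PACKET_FMT, data)
        frame_id, floats = values[0], values[1:]
        # Group the flat float list into per-segment (position, quaternion).
        segments = [
            (floats[i:i + 3], floats[i + 3:i + 7])
            for i in range(0, len(floats), FLOATS_PER_SEGMENT)
        ]
        yield frame_id, segments

# In a real-time loop, each received pose would drive the character rig:
# for frame_id, segments in receive_poses():
#     apply_to_skeleton(segments)   # engine-side update, hypothetical helper
```

The latency budget is what makes this "live": because each datagram carries a complete pose, the engine can always apply the newest packet and simply drop late ones, which is why the performance on stage stays in sync with the actor.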