Computer Graphics World

July-Aug-Sept 2021

Issue link: http://digital.copcomm.com/i/1399888


actor's performance as a digital being. The studio showed them work it had been doing on new methods of facial capture, which led to it being awarded the majority of the work pertaining to Thanos and several other characters for Avengers: Infinity War (2018) and Endgame (2019).

While the studio was getting those shows into production, new technologies were surfacing — specifically, machine learning and deep learning — while real-time rendering techniques were becoming more robust. "Doug and I started seeing how important these developments were, not just to feature films, but also to new kinds of applications, such as home assistants," says Darren Hendler, a 20-year veteran at Digital Domain. The Doug he is referring to is Doug Roble, senior director of software R&D, another 20-year veteran at the studio.

As a result, Digital Domain formed the Digital Humans Group (DHG) near the start of 2018 with the goal of overseeing all the studio's technology in the realm of digital humans, digital creatures, and capturing actors for digital effects, commercials, games, and "new, crazy things we couldn't even imagine at that stage," says Hendler, DHG director. The group was spurred on by research papers and demos over the past few years at SIGGRAPH, and by machine learning, which Roble calls "a game changer" in the way a person can look at problems and approach them.

The DHG team comprises 15 to 20 people. And even though there is a software group working on a wide gamut of technologies, it does not treat every issue as a software problem. "We look at it as software and artistry together, and there's a constant back and forth between them," says Hendler.
Creating Digital Humans

At Digital Domain, the process of creating a digital human (any digital human) for the longest time had been very much the same: Build a 3D version of a person and create textures, look dev, and motion, and then create a camera, film an actor from different angles, and see how the person looks — all in the 3D realm. This is the process the studio used for Benjamin Button and Thanos.

Now, however, technologies around that entire realm have evolved and spawned a new realm: the image-based, or neural rendering, realm. "It didn't exist a year ago, and it's changing everything," Hendler notes. This area is tied to concepts like deepfakes. So, instead of building the character pixel by pixel, images or input are used to generate new images of that person or creature. The DHG does not operate in this realm per se. Rather, the group uses a hybrid approach — one that combines the traditional 3D realm and the image-based realm by leveraging ray tracing from the latest Unreal Engine from Epic Games and Digital Domain's own sophisticated neural renderer.

Most of DHG's work currently involves building a full three-dimensional version of a character — often a digital double of an actor — for film or television. This entails acquiring high-res scans down to the pore level. "These days it's a pretty easy process to generate a fully rendered still of the person from any camera angle, and not be able to distinguish the real actor from the still," says Hendler. "The tricky part happens when that person moves, talks, and emotes. There's so much subtlety involved — for

[Caption: ILM de-aged Joe Pesci and others to play their younger selves for The Irishman.]

[Caption: The older Benjamin Button character was a CG human crafted by Digital Domain.]

[Caption: Digital Domain used a hybrid blending technique to age soccer star David Beckham.]
