Computer Graphics World

July-Aug-Sept 2021



…thanks in large part to machine learning, which has enabled artists and researchers to tackle some very challenging problems, especially in terms of realism. Machine learning, says Nichols, approaches that problem analytically, not emotionally, "and it is making a huge difference."

"For something to look really humanlike, a good dataset is imperative, and machine learning is going to help significantly in that regard." The Wikihuman Project initiated that work, and now tools such as MetaHuman Creator are exploring it even further (see "MetaHuman Creator," this page).

Technological Improvements

Nichols agrees with Hendler that machine learning supplies the important human subtleties that had been missing from CG humans, an absence that contributed to the uncanny look of the characters. "There's a lot of things that go on in our faces that we don't necessarily see, but machine learning can 'see' and 'learn' those things, and apply them to the model," says Nichols. "We tend to only see the bigger motions, not the subtle lip or eyelid movement, for instance."

As a result, machine learning has accelerated progress in this area by leaps and bounds, eliminating the emotional factor of interpreting (or rather, misinterpreting) the face, as well as greatly speeding up the trial and error during the creation process.

Before, a lot of technology was needed to capture a human face: a lot of dots on the face and lots of information. Today, much can be done with a mobile phone. "Even capturing technology is changing," Nichols points out. Whereas once a full motion-capture stage was needed, now a lot can be done with an inexpensive Rokoko mocap system, he adds.

While there are some impressive developments and applications for crafting realistic real-time autonomous digital humans, Nichols notes that a crucial element is still missing: full ray tracing. "We've done this a million times in the visual effects world. You can't really get a human to look completely real without ray tracing. And that's the thing that still hasn't gotten to the real-time digital human world just yet," says Nichols.

"It's really hard to do subsurface scattering, hardcore shading, right now in real time," says Nichols. "It's going to get there, it always does, but it's a real challenge."

MetaHuman Creator

Creating realistic-looking digital humans for use in a real-time engine used to be a major feat. However, Epic Games is making it easier with its MetaHuman Creator tool, a new, free browser-based app for building fully rigged digital humans, complete with hair and clothing, in less than an hour, for direct use in Epic's Unreal Engine. In early access, MetaHuman Creator runs in the cloud via Unreal Engine Pixel Streaming.

"Up until now, one of the most arduous tasks in 3D content creation has been constructing truly convincing digital humans. Even the most experienced artists require significant amounts of time, effort, and equipment, just for one character," says Vladimir Mastilovic, VP of Digital Humans Technology at Epic Games. "After decades of research and development, and thanks to bringing companies like 3Lateral, Cubic Motion, and Quixel into the Epic family, that barrier is being erased."

MetaHuman Creator enables users to easily generate new characters through intuitive workflows that let them sculpt and craft results as desired. As adjustments are made, MetaHuman Creator blends between actual examples in the library in a plausible, data-constrained way.
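One way to picture that "plausible, data-constrained" blending is as a weighted combination of preset meshes in which the weights are kept non-negative and sum to one, so the result never strays outside what the scanned examples span. The Python sketch below is purely illustrative of that idea; the function name, array layout, and normalization are assumptions for this article, not Epic's actual implementation.

```python
import numpy as np

def blend_presets(presets: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Blend preset face meshes as a convex combination.

    presets : (num_presets, num_vertices, 3) array of vertex positions
    weights : (num_presets,) user-chosen influence per preset
    """
    w = np.clip(weights, 0.0, None)   # no negative influence
    w = w / w.sum()                   # weights sum to 1, staying inside the data
    # Each output vertex is a convex combination of the corresponding preset
    # vertices, so the result cannot drift outside the library examples.
    return np.tensordot(w, presets, axes=1)

# Example: blend three preset heads, leaning toward the first one.
presets = np.random.rand(3, 5000, 3)   # stand-in for scanned face meshes
blended = blend_presets(presets, np.array([0.6, 0.3, 0.1]))
```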
Users can choose a starting point by selecting a number of preset faces to contribute to their human from the range of samples available in the database. Users are able to apply a variety of hair styles that use Unreal Engine's strand-based hair system, or hair cards for lower-end platforms. There is also a set of example clothing to choose from, as well as 18 differently proportioned body types.

When ready, users can download the asset via Quixel Bridge, fully rigged and ready for animation and motion capture in Unreal Engine, complete with LODs. Users will also get the source data in the form of an Autodesk Maya file, including meshes, skeleton, facial rig, animation controls, and materials. These digital humans can run in real time on high-end PCs with an Nvidia RTX graphics card, even at their highest quality with strand-based hair and ray tracing enabled.

MetaHuman Creator has made it easier to generate realistic human models.
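The split between strand-based hair on high-end PCs and hair cards on lower-end platforms is, in effect, a per-platform quality fallback of the kind any LOD pipeline makes. The following sketch is hypothetical; the Platform fields, memory threshold, and return labels are invented for illustration and are not Unreal Engine's API.

```python
from dataclasses import dataclass

@dataclass
class Platform:
    supports_strand_hair: bool   # e.g., a high-end PC with an RTX-class GPU
    gpu_memory_gb: float

def choose_hair_representation(platform: Platform) -> str:
    """Load full strand-based grooms where the hardware allows it,
    otherwise fall back to lightweight hair cards."""
    if platform.supports_strand_hair and platform.gpu_memory_gb >= 8:
        return "strand_based_groom"
    return "hair_cards"

# Example: a high-end desktop gets strands, a mobile-class device gets cards.
print(choose_hair_representation(Platform(True, 10.0)))   # strand_based_groom
print(choose_hair_representation(Platform(False, 4.0)))   # hair_cards
```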
