Computer Graphics World

Edition 2 2018

Issue link: https://digital.copcomm.com/i/997232


backs of buildings and anything else that wasn't visible to the camera. Character models got a poly count upgrade to work with the new animation. For the characters' upgraded heads, Epic implemented a universal facial topology across all four characters so the facial rigs could be kept consistent and animation could be shared among characters. Each character model had about 185,000 triangles.

For body rigs, Epic used Unreal Engine's Animation & Rigging Toolset (ART), a full suite of animation tools that operates as a Maya plug-in. Animation of ART rigs is easy to pop into Unreal Engine for real-time review.

Facial animation posed a different challenge for the Fortnite team. The final facial rig was a combination of 201 joints, lattices, and blendshapes. At the time of the Fortnite trailer's production, the FBX format could export up to eight influences per vertex, but the team needed more than that to get the facial performance they wanted. Alembic exports baked vertex positions, so the team used as many influences and deformers as necessary within Maya before exporting the facial animation to Unreal via Alembic cache.

The facial rig provided on-surface controllers for animators to modify the facial performance as needed, as opposed to a virtual blendshape control board. All facial performance was manually keyframed in Maya.

When an animation is exported via Alembic, the ABC files store the position of every vertex on every frame. For a model with thousands of vertex positions on each frame in a three-minute animation, the amount of data adds up very quickly. Analyzing and playing back such a large volume of data in real time isn't feasible. Instead, Unreal Engine imports and stores Alembic animation data using principal component analysis (PCA) compression.
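The general idea of compressing baked vertex animation into a handful of morph targets, then blending them back per frame, can be sketched in a few lines of NumPy. This is an illustration of the PCA technique itself, not Epic's actual implementation; the frame, vertex, and target counts are invented, and the synthetic data is deliberately low-rank so the compression here happens to be lossless.

```python
import numpy as np

rng = np.random.default_rng(0)
n_frames, n_verts, n_targets = 120, 500, 8  # made-up sizes

# Baked Alembic-style animation: one (x, y, z) position per vertex per
# frame, flattened to a (frames, verts*3) matrix. Built as a low-rank
# blend so a small target set can represent it exactly.
basis = rng.standard_normal((n_targets, n_verts * 3))
coeffs = rng.standard_normal((n_frames, n_targets))
anim = coeffs @ basis

# "Import": center on the mean pose and keep the top principal
# components; each component acts as one morph target.
mean_pose = anim.mean(axis=0)
u, s, vt = np.linalg.svd(anim - mean_pose, full_matrices=False)
morph_targets = vt[:n_targets]               # (targets, verts*3)
weights = u[:, :n_targets] * s[:n_targets]   # per-frame blend weights

# "Playback": each frame is the mean pose plus a weighted blend of the
# morph targets, which is cheap enough to evaluate in real time.
frame = 42
reconstructed = mean_pose + weights[frame] @ morph_targets
```

The storage cost drops from frames × verts × 3 floats to one mean pose, a small set of morph targets, and a few blend weights per frame, which is what makes real-time playback of a three-minute facial performance practical.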
During import, the process distills the enormous amount of vertex-based animation data into a smaller, more manageable set of morph targets while keeping the fidelity of the original animation. During playback, Unreal Engine loads the new set of morph-target data into memory and blends the targets per frame in real time. In this way, the Fortnite facial animation exported via Alembic could be played back in real time in Unreal Engine. AnimDynamics, another Unreal Engine feature, was used to enhance the characters' secondary animation detail, such as hair and clothing movement, all in real time.

Pulling It All Together

While modeling, rigging, and animation for each asset in the Fortnite trailer were done in a DCC package, rough and final layout of each scene was done in Unreal Engine. This approach differs from a traditional CG pipeline, where scene assembly happens within the DCC package itself. By finalizing each asset separately within a DCC package and aggregating the assets in Unreal Engine, the team was able to work on characters and sets in parallel.

Data transfer between the DCCs and Unreal Engine used either of two file formats: FBX, for models and editable/animatable rigs destined for the Unreal Engine ART, or Alembic, for complex animation such as baked morph targets.

With all these files going back and forth into Unreal Engine, work on the latest files has to be tracked somehow to keep things efficient. In traditional pipelines, data tends to be stored across multiple file servers and various locations with different naming conventions, which can require carefully designed structures and scripts to point to the proper file. There usually is a "home" directory of sorts, but it can be confusing. For the Fortnite trailer, Epic kept things organized by using a centralized Perforce software depot.
By utilizing the source control mechanisms within Unreal, a user can check out and check in assets as needed for modifications, much like checking out a book from a library.

REAL-TIME VOLUMETRIC RENDERING KEPT THE EFFECTS LIVE AND ADJUSTABLE IN UE.
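The library analogy describes an exclusive check-out model: a central record tracks who currently holds each asset, and a second user cannot edit it until it is checked back in. A toy Python sketch of that concept follows; the class, methods, and asset path are all hypothetical, and this is not Perforce's or Unreal's actual API.

```python
class AssetDepot:
    """Toy central depot tracking which user holds each asset."""

    def __init__(self):
        self._checked_out = {}  # asset path -> user currently holding it

    def checkout(self, path, user):
        """Check an asset out for editing; fails if someone else holds it."""
        holder = self._checked_out.get(path)
        if holder is not None and holder != user:
            raise RuntimeError(f"{path} is checked out by {holder}")
        self._checked_out[path] = user

    def checkin(self, path, user):
        """Return the asset so other users can modify it."""
        if self._checked_out.get(path) != user:
            raise RuntimeError(f"{user} does not hold {path}")
        del self._checked_out[path]


depot = AssetDepot()
depot.checkout("Characters/Face_Anim.abc", "animator_a")   # hypothetical path
# depot.checkout("Characters/Face_Anim.abc", "animator_b") # would raise
depot.checkin("Characters/Face_Anim.abc", "animator_a")
```

A real depot adds versioned history, file locking policies, and syncing, but the check-out/check-in cycle above is the behavior the article compares to borrowing a library book.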
