Computer Graphics World

Edition 2 2018

The checkout process temporarily locks the asset from being changed by other users, but they are still able to access the existing file for read-only purposes. Once the user completes their changes, the files are uploaded to the depot through Unreal's source control tools and the UnrealGameSync (UGS) utility (a customized wrapper that controls Perforce). UGS then records and annotates all the modifications into a change list with which other users can synchronize across the entire project.

Shots were composed and reviewed as level sequences within the Unreal Engine Sequencer, a non-linear editing and animation sequencing tool. Within Sequencer, scene elements like characters, props, lighting, and special effects can be animated, layered together, and reviewed in sequence.

Post Processing and Special Effects

Traditional animation pipelines rely on renderfarms to calculate visual effects and post-processing enhancements as separate render passes that must be composited in a dedicated compositing program such as Adobe's After Effects or Foundry's Nuke. While the use of such programs has its place, Unreal Engine eliminates a number of these steps by providing those capabilities within the engine.

The game's smoke-like storms and enemy deaths required special attention. In Fortnite, storms roll in as purple fog. Enemies always emerge from the storms themselves, and when killed, they evaporate into purple smoke. For the Fortnite trailer, real-time volumetric rendering methods were used to keep all the effects live and adjustable in Unreal Engine, rather than importing them from an off-line package.

Some shots called for cloud effects to evoke the feel of the impending storm but didn't require the custom shapes needed by enemies and storms. For such shots, the Fortnite team made use of Unreal Engine's built-in Volumetric Fog.

Storm clouds called for curling, flowing shapes. The shapes of the storms were animated using 3D Flowmaps, a type of animated texture in Unreal Engine. The Flowmaps were hand-painted inside virtual reality using custom Unreal Engine tools. Painting the clouds' Flowmaps in VR was a more natural experience than using a mouse or stylus.

For enemy deaths, the Fortnite team used a fully real-time fluid simulation. To start off the smoke for each fluid simulation, the monsters' skeletal meshes were converted into emitters. Both the monsters' motion just prior to death and normals from the skeletal meshes contributed to the fluid's motion.

The team implemented a number of clever tactics to improve playback speed, such as forgoing multiple light bounces on the smoke volumes and using blurring instead. Because the action is so fast, the visual difference was negligible. Using a color instead of a scalar value for the shadow density meant that light hitting a volume changed color with shadow depth, giving the illusion of depth while allowing a wider range of artistic effects.

Each sim was controlled via the Sequencer. This allowed for very rapid iterations when compared with the wait times incurred by traditional off-line rendered simulations.
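As a rough illustration of the checkout workflow described above, the sketch below uses the P4Python API to perform the raw Perforce operations that Unreal's source control integration and UGS handle for artists: an exclusive checkout (the "+l" filetype modifier is what locks a binary asset for other users) followed by a changelist submit. The server address, workspace, user, and depot path are hypothetical.

```python
# Minimal P4Python sketch of an exclusive checkout and submit.
# Server, workspace, user, and depot path below are hypothetical.
from P4 import P4

p4 = P4()
p4.port = "ssl:perforce.example.com:1666"
p4.user = "artist01"
p4.client = "artist01_fortnite_ws"

try:
    p4.connect()

    # Exclusive checkout: the +l filetype modifier locks the binary asset
    # so other users can only open the existing revision read-only.
    asset = "//depot/Project/Content/Characters/Hero.uasset"
    p4.run("edit", "-t", "binary+l", asset)

    # ... the artist edits the asset in the Unreal Editor ...

    # Submit the default changelist; UGS then annotates the resulting
    # change list so other users can sync it across the project.
    change = p4.fetch_change()
    change._description = "Update hero asset for trailer shot 0010"
    p4.run_submit(change)
finally:
    if p4.connected():
        p4.disconnect()
```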
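To make the Sequencer workflow concrete, here is a hedged sketch that uses the Unreal Editor's Python scripting (with the editor scripting plugins enabled) to create a Level Sequence asset and bind a selected actor to it with a transform track. The asset name and content path are invented for the example, and this is not presented as Epic's actual shot-building tooling.

```python
# Hedged sketch: create a Level Sequence and bind selected actors to it.
# Asset names and paths are hypothetical.
import unreal

# Create a new Level Sequence asset to hold the shot.
asset_tools = unreal.AssetToolsHelpers.get_asset_tools()
shot = asset_tools.create_asset(
    asset_name="SEQ_Shot0010",
    package_path="/Game/Cinematics/Shots",
    asset_class=unreal.LevelSequence,
    factory=unreal.LevelSequenceFactoryNew(),
)

# Bind each selected level actor (a character, prop, or light) so it can
# be animated and layered with the other shot elements in Sequencer.
for actor in unreal.EditorLevelLibrary.get_selected_level_actors():
    binding = shot.add_possessable(actor)
    transform_track = binding.add_track(unreal.MovieScene3DTransformTrack)
    transform_track.add_section()

unreal.EditorAssetLibrary.save_loaded_asset(shot)
```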
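The storm animation relies on the flowmap idea: a painted texture of flow vectors advects the lookup into a cloud texture over time, and two samples offset by half a cycle are crossfaded so the distortion never visibly pops when it resets. The sketch below demonstrates that math on 2D NumPy arrays; Epic's production version is a 3D, material-based variant painted in VR, so the function names and parameters here are purely illustrative.

```python
# Illustrative 2D flowmap animation: advect texture lookups along a painted
# flow field and crossfade two half-period-offset phases.
import numpy as np

def sample_bilinear(tex, u, v):
    """Wrapping bilinear lookup into a (H, W) or (H, W, C) texture."""
    h, w = tex.shape[:2]
    x, y = u * w, v * h
    x0, y0 = np.floor(x).astype(int), np.floor(y).astype(int)
    fx, fy = x - x0, y - y0
    if tex.ndim == 3:                          # broadcast weights over channels
        fx, fy = fx[..., None], fy[..., None]
    def texel(xi, yi):
        return tex[yi % h, xi % w]             # wrap addressing
    return ((1 - fx) * (1 - fy) * texel(x0, y0) + fx * (1 - fy) * texel(x0 + 1, y0)
            + (1 - fx) * fy * texel(x0, y0 + 1) + fx * fy * texel(x0 + 1, y0 + 1))

def flowmap_sample(base_tex, flow_tex, u, v, time, strength=0.05, period=1.0):
    """Animate base_tex along flow_tex (flow encoded in [0, 1], 0.5 = still)."""
    flow = (sample_bilinear(flow_tex, u, v) - 0.5) * 2.0 * strength
    phase_a = (time / period) % 1.0            # resets once per period
    phase_b = (time / period + 0.5) % 1.0      # offset by half a period
    sample_a = sample_bilinear(base_tex, u - flow[..., 0] * phase_a,
                               v - flow[..., 1] * phase_a)
    sample_b = sample_bilinear(base_tex, u - flow[..., 0] * phase_b,
                               v - flow[..., 1] * phase_b)
    # Each layer's weight hits zero exactly when its phase wraps, hiding the pop.
    blend = abs(phase_a - 0.5) * 2.0
    return sample_a * (1.0 - blend) + sample_b * blend
```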
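The colored shadow density mentioned above boils down to per-channel Beer-Lambert attenuation: each RGB channel of the light is extinguished at its own rate, so the deeper the shadow, the more the transmitted light shifts in hue. The tiny sketch below shows the idea; the density values are illustrative (chosen so deep shadow trends purple) and are not Epic's actual shader parameters.

```python
# Per-channel ("colored") shadow density via Beer-Lambert attenuation.
# Density values are illustrative, not taken from Epic's shaders.
import math

def colored_transmittance(shadow_depth, density_rgb):
    """Transmittance computed independently for each channel."""
    return tuple(math.exp(-d * shadow_depth) for d in density_rgb)

def shade(light_rgb, shadow_depth, density_rgb=(2.0, 3.5, 1.2)):
    """Attenuate incoming light by a colored shadow density.

    With this default density, green dies off fastest, so light deep inside
    the volume trends toward the storm's purple.
    """
    t = colored_transmittance(shadow_depth, density_rgb)
    return tuple(c * ti for c, ti in zip(light_rgb, t))

if __name__ == "__main__":
    for depth in (0.0, 0.25, 0.5, 1.0):
        r, g, b = shade((1.0, 1.0, 1.0), depth)
        print(f"depth {depth:.2f}: ({r:.3f}, {g:.3f}, {b:.3f})")
```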
Where It's Going

Unreal Engine gave Epic the opportunity to push boundaries by showing that film-quality simulations are possible in real time. Creating all these scenes and visual effects in real time allowed the Fortnite team to achieve a quality and integration that would not have been possible with a traditional render-and-wait workflow.

By using real-time rendering from start to finish on the Fortnite trailer, Epic wanted to inspire filmmakers with a new way of working. Unreal Engine releases filmmakers from the shackles of long render times, making real-time creativity and collaboration possible.

Brian Pohl is Epic's M&E technical account manager for North America.

[Image captions: Real-time playback of Alembic facial motion in Unreal Engine. A Fortnite trailer scene. A motion-capture session.]
