Post Magazine

December 2019


OUTLOOK: ANIMATION/VFX

In animation and visual effects, we're in the midst of a great transition from traditional post workflows to realtime workflows, driven by improvements to realtime rendering tools such as game engines and by an industry-wide growing familiarity with VR. Though the need for traditional rendering tools will always remain, Unreal Engine has now been used in over 70 film and television productions since Disney's Rogue One in 2016, and with each project we find that more people across departments are realizing they can use the engine to get quicker feedback and ultimately reach a desired creative vision more efficiently. In 2020, we expect this trend to continue and to culminate in an overall shift of creative decision-making back to the set, as realtime tools enable more collaborative and interactive workflows, even when visual effects are involved.

Several years ago, VR drove adoption of game engines in the motion picture business, with film and TV studios needing realtime interactivity to properly develop ancillary VR entertainment experiences. More recently, the consumer VR market has slowed from its initial expectations, but a side effect of those years of VR experimentation is that studios realized immersive headsets and game engines are actually great tools for certain aspects of film and TV production, and began incorporating them into their production workflows. At the same time, Epic has been working with leading creatives to develop new Unreal Engine features specific to film and TV workflows, making the engine more user-friendly and easily tailored to these types of projects.
One of the most beneficial use cases to arise from this is VR location scouting, which allows key stakeholders (such as the director, cinematographer, production designer, art director, visual effects supervisor and others) to collaborate, interactively explore a digital environment and home in on key creative choices to finalize a scene or a look ahead of production, rather than punting it to post. This type of interactive multi-user VR exploration can also be used to review any digital element, from spaceships to characters. With realtime rendering, all users can test out lighting options, traverse environments and more on the fly, enabling more informed artistic decisions earlier in the process. All of this helps set creative expectations ahead of time, allowing all departments to work more effectively toward a shared vision. It also allows CG and VFX teams to get involved earlier in production and offer more meaningful creative contributions up front, thereby heading off typical challenges such as compensating for poorly lit plates or slogging through tedious iterations.

Just this year, with additional hardware advancements and new virtual production-focused features in Unreal Engine, such as nDisplay and Live Link, productions are able to utilize LED walls on live-action sets for even greater immersion and collaboration, and ultimately to capture accurate lighting, reflections and visual effects all in-camera. Rather than being confined to VR, LED walls allow filmmakers to bring digital environments, characters and other elements directly to the set, facilitating a true blending of the digital and physical worlds. In this context, everyone can see and interact with the same digital elements in a meaningful way, rather than trying to light a scene or deliver lines against a green screen.
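For readers curious what drives an LED-wall setup like this: nDisplay renders the wall as a synchronized cluster, with a plain-text configuration file telling each render node which window, viewport and physical screen it is responsible for. The fragment below is an illustrative sketch in the spirit of the Unreal 4.x-era .cfg format; all node addresses, resolutions and screen placements are invented for the example, not taken from any real production:

```
# Hypothetical two-node nDisplay cluster for a small LED wall (illustrative only)
[cluster_node] id=node_left  addr=192.168.1.10 window=wnd_left  master=true
[cluster_node] id=node_right addr=192.168.1.11 window=wnd_right
[window]   id=wnd_left  viewports=vp_left  fullscreen=true
[window]   id=wnd_right viewports=vp_right fullscreen=true
[viewport] id=vp_left  x=0 y=0 width=1920 height=1080 projection=proj_left
[viewport] id=vp_right x=0 y=0 width=1920 height=1080 projection=proj_right
[projection] id=proj_left  type=simple screen=scr_left
[projection] id=proj_right type=simple screen=scr_right
# Screen panels are positioned in meters relative to the stage origin,
# so reflections and parallax line up with the tracked camera
[screen] id=scr_left  loc="X=2,Y=-1.2,Z=1" rot="P=0,Y=30,R=0"  size="X=2.4,Y=1.35"
[screen] id=scr_right loc="X=2,Y=1.2,Z=1"  rot="P=0,Y=-30,R=0" size="X=2.4,Y=1.35"
```

Because the screens are described in real-world units and tied to camera tracking (streamed in via Live Link), the engine can render correct perspective for the camera's point of view, which is what makes in-camera lighting and reflections hold up.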
The visual fidelity that can be achieved today with realtime rendering can reach final-pixel quality, with lighting, reflections, digital environments and visual effects elements all captured live in-camera.

Improvements to realtime technology are also impacting animated productions: actors performing on a motion-capture set can receive instant feedback on how their performances are informing the characters, and adjust accordingly. Realtime tools are likewise impacting all the different flavors of "vis" (previs, techvis, stuntvis, postvis and pitchvis), with game engines enabling better quality content, faster. The common thread across all of these improvements is that the visual quality keeps getting better, allowing for near-final, if not final, pixels at every stage.

In addition to the many projects already released, we are looking forward to the premieres of a number of film and TV projects in 2020 that are utilizing Unreal Engine in new and exciting ways, and to the continued adoption of realtime workflows across projects of all shapes and sizes. With this anticipated shift in the creative center of gravity, the set is being freed from the limitations of green screen and transformed into a collaborative sandbox where everyone can contribute their ideas, reach creative consensus and encounter the kinds of "happy accidents" that only happen with interactive trial and error.

IN-CAMERA VFX: SHIFTING THE PRODUCTION PARADIGM WITH REALTIME TOOLS

BY DAVID MORIN
HEAD OF LOS ANGELES LAB, EPIC GAMES, DEVELOPER OF UNREAL ENGINE
CARY, NC
WWW.UNREALENGINE.COM

[Pictured: VR location scouting, virtual camera, lighting demo]
