Computer Graphics World

April 2011

Issue link: https://digital.copcomm.com/i/30784


Animation

…assist using Google Earth, which the artists used as reference for placing lights and so forth into their scene, giving them exact coordinates so they could match the real-world lighting exactly. "It was a synthesis of modern technology that made it very easy and helped the Unreal guys. So when we brought a character into [the Unreal environment], we had a nice lighting setup that mimicked the real world," says Krygowski. The live-action setting was augmented with CG depth passes, fog, and other atmospherics. In addition to the CG characters, the RBC Center was also created in 3D, as it opens up and transforms into a monster. In contrast, for the original project, the group is building all the environments in CG, making sure that each contains one iconic element from each locale, in addition to the respective arena.

Alas, just 48 hours before HOM shipped the All-Star project, the group had to add six shots, including a new ending to the film, when some of the external resources didn't quite blend with the rest of the show. "We had to design [those shots], comp them, animate them, light them, and add effects really quickly," says Rausch. And this is where the game engine earned MVP honors, enabling the crew to work at an unprecedented pace.

[Image caption: Because of the amount of animation needed for the various characters, HOM opted to use the Unreal Engine to render and light scenes in real time, which made the process more efficient.]

Just a Game

Faced with the daunting task of modeling and animating 30 main characters in a short period of time for the initial project, it was Krygowski who came up with the solution of using a game engine to render the assets. "We were in an office trying to figure out how to get all this done in a timely fashion, and, having come from the game world, I blurted out, 'I wish we could use a game engine,' and that's how the idea was born," he recalls.

The HOM crew quickly settled on Unreal because the development kit was free, enabling the group to test it. Once they were convinced it would work, they contacted Epic, which agreed to work with HOM on licensing it for this nontraditional usage. "One of our tech gurus quickly picked it up, and his enthusiasm and ability to see down the road in terms of what we could do with it assured us that we were on the right track," recalls Krygowski.

Krygowski believes that The Guardian Project short, which was done in about two and a half months, would have easily taken six months if HOM hadn't made the choices it did, including using the game engine. "Game engines give you the ability to light and render scenes interactively in real time, even when you're dealing with multiple characters," says Krygowski. "With the game engine, you reduce the time that it takes to make crucial creative decisions because you have the ability to previsualize fully rendered scenes." As a result, the group chose Unreal not only for its reduced render times, but also for the dynamic options it offers for environmental controls and effects.

To accommodate this new approach, the team wrote several pieces of code to generate custom shaders and bring virtual cameras into and out of the Unreal Engine. This enabled the crew to render character passes in roughly 32 seconds per frame, including the motion-blur pass, thus negating the need for a multi-CPU renderfarm. The entire project was rendered on two dual-processor, eight-core machines.

While the crew has used game engines before, those instances were to test animations in a client's engine; this was the first time the group used one to render imagery for post-process output. There were a few disadvantages (for instance, advanced deformers cannot be used on the character rigs), but that was a small price to pay considering the advantages gained.

And, according to Rausch, The Guardian Project will certainly not be the last project for which a real-time engine is used. "We have been talking to two other clients about using real time. It is a pipeline that has legs, for sure," he says. In fact, HOM aims to tie the Unreal Engine to the Vicon motion-capture system so clients will be able to see their recorded mocap performances integrated into game levels rendered in the game engine in real time. "On a project like this, you have to think down the road of potentially extrapolating characters and environments into game assets and a television series, and flowing the CG creative elements between mediums. By building scenes in a game engine out of the gate, our options are much broader; the need to down-res films from broadcast to game output, for example, will be mitigated," Rausch explains.

And while Krygowski remained confident in this new technique, he and the others were nonetheless impressed with the results. "I've worked in the game world for a long time and was impressed at our ability to ramp up and get the project to the screen in the time that we had," he says. "Using a game engine purely for a broadcast application is a new ripple in what game engines are being used for across a wide breadth of media."

The Guardian Project animated short kicked off one of hockey's biggest events, making its debut at center ice. In addition to the film, HOM designed a virtual interactive experience for The Guardian Project booth at the NHL Fan Fair, held at the Raleigh convention center during the start of the all-star weekend. The experience allowed fans to don virtual-reality goggles and immerse themselves in a CG environment in which they get up close and personal with some of the newly introduced superheroes as they perform their signature moves. Thus, a legend is born….

Karen Moltenbrey is the chief editor of Computer Graphics World.
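The render figures quoted in the article invite a quick back-of-envelope check. The sketch below is illustrative only: the 32-seconds-per-frame cost and the two-machine setup come from the text, while the two-minute running time, 24 fps frame rate, and an even split of frames across the machines are assumptions not stated in the article.

```python
# Back-of-envelope render-time estimate using the per-frame cost
# quoted in the article (~32 seconds per character pass).
# Assumed, not from the article: a 2-minute film at 24 fps, and
# frames divided evenly between the two machines.

FPS = 24                 # broadcast frame rate (assumed)
DURATION_SECONDS = 120   # running time of the short (assumed)
SECONDS_PER_FRAME = 32   # per-frame render cost, from the article
MACHINES = 2             # two dual-processor, eight-core machines

frames = FPS * DURATION_SECONDS
single_machine_hours = frames * SECONDS_PER_FRAME / 3600
wall_clock_hours = single_machine_hours / MACHINES

print(f"{frames} frames")                          # 2880 frames
print(f"{single_machine_hours:.1f} machine-hours") # 25.6 machine-hours
print(f"{wall_clock_hours:.1f} hours wall clock")  # 12.8 hours wall clock
```

Under those assumptions the whole character render is roughly an overnight job on two desktop machines, which is consistent with the article's claim that the approach negated the need for a multi-CPU renderfarm.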
