Post Magazine

November 2012

Issue link: https://digital.copcomm.com/i/94072


"As the show expanded from 130 shots to 400 shots over the course of a couple of months, the cloud enabled us to actually grow with the show without incurring insane amounts of cost in doing so." Virtually the entire movie was rendered in a cloud-based system developed by Boston-based Zync. "They have a really great system for both managing how data gets into and out of the cloud, as well as bringing up virtual computers in the cloud and managing them as they churn away on your data and then shutting down once they're done."

What Zync's cloud solution allowed Atomic to do is treat computing like a gas or electric utility at your house: you pay only for what you use. They were able to scale up and down not just with the needs of a show, but with the needs of the crew on an hour-by-hour basis. That made their costs very predictable, because every shot had its own rendering cost associated with it. "You're paying for what you use on an hourly basis, so it actually costs the same to render on 100 computers for an hour as it would to render on 10 computers for 10 hours," explains Baillie.

Artist time is expensive, so if a studio can render shots fast as artists are building them, that's a huge cost savings. "They're going to be able to react and keep that creative momentum going," he says. "It would have been impossible for us to actually keep up with the growth of this show any other way. I can go from having 50 computers running in the cloud to having 400 computers running in the cloud within a couple of minutes." With an in-house renderfarm, "there would be no way I could turn on a dime and add another 100 computers within a week even, or two weeks, or three weeks. Building out data centers takes a long time, and a lot of money. Whereas when you're using the cloud you can react to changes like that instantly. That is a hugely liberating concept both as a business owner and as an artist."
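Baillie's point about utility-style pricing comes down to node-hours: cost depends on the product of machines and time, not on either alone. A minimal sketch of that arithmetic (the hourly rate is an illustrative placeholder, not Zync's actual pricing):

```python
# Utility-style cloud render pricing: cost scales with node-hours,
# so wall-clock time becomes a free variable you can buy down.
# RATE_PER_NODE_HOUR is a made-up illustrative figure.

RATE_PER_NODE_HOUR = 1.00  # hypothetical dollars per machine per hour

def render_cost(nodes: int, hours: float, rate: float = RATE_PER_NODE_HOUR) -> float:
    """Total cost for running `nodes` machines for `hours` hours."""
    return nodes * hours * rate

# Baillie's example: 100 computers for 1 hour costs the same as
# 10 computers for 10 hours, but finishes 10x sooner.
fast = render_cost(nodes=100, hours=1)
slow = render_cost(nodes=10, hours=10)
assert fast == slow
```

The practical consequence is the one the article draws: since the bill is identical either way, the only rational choice is the wider, faster run, which keeps expensive artist time from idling while frames render.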
Baillie hopes that both the movie studios and the artists creating the software are going to get onboard with this cloud idea and embrace it. "When they see that a top-caliber film like Flight can be successfully made with the cloud, and made even better as a result of the cloud, I hope it's going to spur these people on to stop being scared of the concept and to start embracing it."

For 3D, animation, lighting and rendering, they used Maya, with V-Ray as the primary renderer. They also used 3DS Max for matte painting work and some effects work. "3DS Max has some great tools, some great plug-ins like FumeFX and Krakatoa available to it. For compositing, we're entirely Nuke."

"We're also big fans of Shotgun," notes Baillie. "We use Shotgun to do all our production management. In keeping with having a small, nimble, fast-moving team on the show, we basically had a production team of five people, and that includes editorial, managing all 400 shots over the course of four months. That was only possible due to Shotgun and our internal asset management system called Fidget."

They use mostly Macs running OS X. Their core server infrastructure runs on Linux machines built by Lightbeam. The matte painters and some of the VFX artists work in 3DS Max on Windows.

THE SHOOT, DAILIES

Backing it up a little, let's talk about production. Flight was shot with Red Epics, and Light Iron Los Angeles handled the entire on-set data workflow and digital dailies production with their Outpost systems. Light Iron also did the final DI and DCP master. "The Red Epic was actually a really great camera for us to use in a lot of ways because of its size and weight," declares Baillie.

When they were shooting the sky-background helicopter plates, they had to cover themselves for any situation. Burgess and Baillie devised a custom-made rig for the front of the helicopter on which they mounted three Red Epics, each with a 14mm Ultra Prime lens.
The cameras were fanned out so that between the three of them they captured a 240-degree stitchable panorama of everything the helicopter saw. Instead of doing the same action three times with the camera pointed three different ways, they were able to get any action all in one pass. "We ended up getting enough helicopter footage to fulfill the needs of 80 percent of the shots that we had to achieve."

Flight visual effects were done at 3K and final deliverables rendered at 2K. "We actually found that little bit of extra resolution helped make a lot of our greenscreen extractions a lot easier," reports Baillie.

Baillie says one challenging aspect of the film was "having worked with Robert Zemeckis in the motion capture all-CG world and then taking the language that we developed there and applying it back to live-action filmmaking. A lot of live-action filmmaking has more restrictions to it than there would be in a completely digital world. A lot of the ideas that Zemeckis had were still in that fantastic realm. Trying to figure out how to achieve these amazing, big ideas in the timeframe that we had was probably one of the biggest creative challenges."

(Photo caption: Atomic used Autodesk's Maya and 3DS Max, along with V-Ray, FumeFX, Krakatoa and Nuke.)

(Photo caption: Atomic Fiction was able to create so many VFX in such a short time thanks to Zync's cloud rendering solution.)
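The geometry of the three-camera helicopter rig can be sketched with basic lens math. This is an illustrative back-of-the-envelope check, not a description of the actual rig: the sensor width below assumes the Red Epic's roughly 27.7mm-wide sensor, and real coverage depends on the recording format and rig angles.

```python
import math

# Assumed values: 27.7 mm is an approximate Red Epic sensor width;
# 14 mm is the Ultra Prime focal length named in the article.
SENSOR_WIDTH_MM = 27.7
FOCAL_MM = 14.0

def horizontal_fov_deg(sensor_width_mm: float, focal_mm: float) -> float:
    """Horizontal angle of view for an ideal rectilinear lens."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

per_camera = horizontal_fov_deg(SENSOR_WIDTH_MM, FOCAL_MM)  # roughly 89 degrees

# Three cameras covering a 240-degree panorama leave this much
# shared overlap at each of the two seams, which the stitch needs.
TARGET_DEG = 240.0
seam_overlap = (3 * per_camera - TARGET_DEG) / 2
```

Under these assumptions each 14mm camera sees close to 90 degrees, so three fanned cameras yield the 240-degree panorama with roughly 14 degrees of overlap per seam, which is a comfortable margin for stitching.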
