Post Magazine

July 2010

Clink’s visual effects artists work off a LaCie 12big Rack Network, a customizable, network-attached and block storage system.

CLINK

When Funnelbox opened its high-end visual effects and motion graphics division, Clink (www.clink.com), about nine months ago, meeting storage needs was a priority. “At Funnelbox we had tried sharing media with FireWire drives or Ethernet, and they were too slow,” says Crocker. “So we initially went with an Apple Xsan with four Xserve RAIDs for 18TB of storage for editorial. But as we grew, we noticed that when one machine was rendering it put a performance hit on the whole system. We really needed to upgrade and moved to LaCie’s 12big Rack Network,” a customizable, network-attached and block storage system.

Funnelbox and Clink installed two 12big units that provide 24TB of storage. “The system was fairly new, but we liked its speed and redundancy, and it had cool management tools,” Crocker reports. “We kept one of the Xserve RAIDs for immediate archiving and sold the rest.”

LaCie’s 12bigs “show up on every desktop as a local drive, so everyone has the same file structure in the drive. We organize projects by job number with files within that: After Effects files, Photoshop files, Final Cut files, renders. Everything stays together, and everyone knows where everything is or should be,” Crocker explains. “And there’s no file lag time; everything is instantaneous.” In fact, when McKay Marshall came on board, he “didn’t realize that artists weren’t working off local storage,” he recalls.
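The project layout Crocker describes, one folder per job number with the same per-application subfolders on every workstation, could be sketched roughly as follows. The folder names and root path are assumptions for illustration; the article only lists the file types, not the actual directory names Clink uses.

```python
"""Sketch of a shared-volume layout organized by job number.
Folder names here are hypothetical, not Clink's actual convention."""
from pathlib import Path

# Subfolders for the file types named in the article (names are assumptions)
SUBFOLDERS = ["AfterEffects", "Photoshop", "FinalCut", "Renders"]

def create_project(root: Path, job_number: str) -> Path:
    """Create a job-number folder with the standard subfolders."""
    project = root / job_number
    for name in SUBFOLDERS:
        (project / name).mkdir(parents=True, exist_ok=True)
    return project

# Because the shared volume mounts identically everywhere, every artist
# can predict a path from the job number alone:
project = create_project(Path("/tmp/shared_volume"), "JOB-1042")
print(sorted(p.name for p in project.iterdir()))
# → ['AfterEffects', 'FinalCut', 'Photoshop', 'Renders']
```

The point of such a convention is exactly what Crocker notes: with identical mounts and a predictable layout, "everyone knows where everything is or should be."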
“The 12bigs are unlike any network I’ve seen. They’ve met and exceeded our storage and speed needs — they’ve been a dream.”

Clink does commercials, Web videos and point-of-sale videos for high-profile marketers such as Nike, Wacom, Adidas and Capital One. A two-minute launch video for Wacom’s flagship Cintiq 21UX found multiple Clink artists creating Autodesk Maya 3D sequences, rendering Adobe After Effects and Photoshop files, manipulating live action it acquired and compositing the works in one final shot. “The pipeline was vast,” Marshall notes. “There were times when two or three artists were accessing the same stuff, and we didn’t see any kind of hit on the system. It was just seamless.”

Crocker expects the 12bigs to continue to meet storage needs for quite some time. “They’re scalable, so we can get a bigger hard drive and rebuild the SAN, or if we get another workstation and need more bandwidth and throughput we can add another Fibre Channel expansion module for a bigger pipeline,” he says.

Speed will be key as Clink begins to do productions in 4K, he notes. The company just finished creating 4K content to play via BrightSign digital signage controllers at InfoComm. “Even 4K is starting to get trumped by higher resolutions,” notes Marshall. “As pictures get bigger, with more resolution and denser pixel counts, the hardware that can keep up with that will be the hardware that wins.”

CINESITE

In London, Cinesite (www.cinesite.com) has been busier than ever, creating visual effects for Marmaduke and counting among work in progress Harry Potter & The Deathly Hallows: Part 1, The Chronicles of Narnia: The Voyage of the Dawn Treader, John Carter of Mars and Battle: Los Angeles.
“We’ve been asked to do more VFX shots on more shows at the same time,” notes technical services manager Peter Robertshaw. “We’re starting to see more demands for stereoscopic images, and we’re doing a lot of fluid dynamics simulations.” All of this put pressure on the company’s existing clustered storage solution.

So last fall Cinesite invited several vendors to bring their storage products on-site for testing with non-production data. Based on performance, features and price, a pair of BlueArc Mercury 100 heads with about 100TB of disk storage was selected. Cinesite bought another 30TB of storage almost immediately and has since acquired two more Mercury 100s and another 70TB.

“We were looking for a scalable system that we could grow with confidence,” Robertshaw says. “One of the issues we had with our previous system was getting directory-level data out of it in a timely fashion. We went out of our way to test for this during our selection process, and BlueArc was able to get information out in a much more responsive way.”

He likes the ability of the BlueArc system to use both SAS and SATA disks and operate two tiers of storage. “We put different demands on the different tiers,” he explains. “The hot file that we access a lot, we put on tier one with a high-performance SAS disk. Other randomly-accessed files, such as raw scans that are fed sequentially into the pipeline, we put on tier two with the SATA disk. Targeting tiers of storage to usage patterns is a smart way to run the system.”

Cinesite typically handles up to eight projects at any one time, with more than 350 effects artists concurrently accessing thousands of files ranging in size from a few kilobytes to hundreds of gigabytes, frequently in unpredictable sequential and random patterns, placing enormous demands on throughput and I/O performance.
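The tiering policy Robertshaw describes amounts to a simple placement rule: hot, frequently accessed files go to the fast SAS tier, while large raw scans that stream sequentially through the pipeline sit on cheaper SATA. A minimal sketch of such a rule, with purely hypothetical thresholds (the article gives none, and BlueArc's actual tiering is managed by the system, not hand-coded):

```python
def assign_tier(accesses_per_day: int, sequential: bool) -> int:
    """Place a file on tier 1 (SAS) or tier 2 (SATA).

    Hot, randomly accessed files benefit most from fast SAS spindles;
    large sequential reads stream acceptably from SATA.
    The threshold below is illustrative, not from the article.
    """
    HOT_THRESHOLD = 100  # accesses/day above which a file counts as "hot"
    if accesses_per_day >= HOT_THRESHOLD and not sequential:
        return 1  # SAS tier
    return 2      # SATA tier

# A comp file artists touch constantly lands on tier 1;
# a raw scan fed once through the pipeline lands on tier 2.
print(assign_tier(500, sequential=False))  # → 1
print(assign_tier(3, sequential=True))     # → 2
```

The design point is the one Robertshaw makes: matching each tier's strengths to a usage pattern, rather than paying SAS prices for data that is only ever streamed once.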
Robertshaw gives kudos to BlueArc support and to the system’s redundancy. “If we lost one head, we could keep on working: There’s no single point of failure.” The vendor comes in for high praise indeed when Robertshaw says his storage-solution wish list would consist of “more of this!”

He thinks it’s likely that Cinesite London will add another 100TB this year, and it’s possible the company will need additional Mercury 100 heads. “The maximum cluster size is eight heads; then we can start another cluster,” he points out. “So we can comfortably take the system quite a bit bigger.”

STARGATE STUDIOS

“What we need from VFX storage today is a safe and reliable place to put data and, beyond that, flexibility as needs change,” says Dr. Joseph Meier, CTO at Stargate Studios (www.stargatestudios.net) in Pasadena and Vancouver. “When I started here in 2002, the main problem was that the RAID arrays we kept data on were RAID-0 stripe sets, so if we lost any disk, the entire data set
