Computer Graphics World

January/February 2014

Issue link: https://digital.copcomm.com/i/259450

STORAGE

The studio contacted Venice, California's Open Drives, which offers a scalable and easy-to-manage data storage platform built specifically for the media and entertainment industry. "Jeff Brue at Open Drives based the system on hardware from SuperMicro, and he uses OpenIndiana, which is an open-source operating system that he has fine-tuned for film and media," reports Liesveld. "Our current Open Drives system gives us 50TB of live storage with the capacity to expand to approximately 150TB through the purchase of additional disks."

Savage's production setup offers 50TB of SAS disks that sit behind 960GB of L2ARC cache, which Liesveld describes as an SSD RAID offering very fast read/write capabilities. "Elements and plates that are used frequently are automatically pushed to the L2ARC cache, and are then served off those faster disks instead of relying on the slower SAS pool."

The Open Drives platform uses RAID-Z, which Liesveld likens to RAID-5, though it manages file space better and is self-healing. "ZFS also gives us hot-swap capabilities for backups, so I could create a data pool of disks, push project data to it, and then pull those drives out for archive instead of having to go to LTO-3 or a FireWire-type backup solution," he says.
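For readers who have not worked with ZFS, the setup Liesveld describes maps onto a handful of standard OpenZFS commands. The Python sketch below is purely illustrative and is not Open Drives' or Savage's actual configuration: the pool names (tank, shelf01), the dataset path, and the device names are all hypothetical, and it assumes a host with the OpenZFS tools installed and root privileges.

    #!/usr/bin/env python3
    """Illustrative only: pool, dataset, and device names are hypothetical,
    not Open Drives' or Savage's actual configuration."""
    import subprocess

    def run(cmd):
        # Echo and execute a zpool/zfs command, raising if it fails.
        print("+", " ".join(cmd))
        subprocess.run(cmd, check=True)

    # Production pool: RAID-Z2 across six SAS disks (parity-protected, self-healing).
    run(["zpool", "create", "tank", "raidz2",
         "sdb", "sdc", "sdd", "sde", "sdf", "sdg"])

    # Two SSDs as L2ARC read cache; ZFS promotes frequently read plates
    # and elements to these devices automatically.
    run(["zpool", "add", "tank", "cache", "nvme0n1", "nvme1n1"])

    # A dataset holding one show's plates and elements.
    run(["zfs", "create", "-p", "tank/projects/show01"])

    # Removable backup pool: build it on spare disks, replicate the wrapped
    # show into it, then export the pool so the drives can be pulled.
    run(["zpool", "create", "shelf01", "raidz", "sdh", "sdi", "sdj"])
    run(["zfs", "snapshot", "tank/projects/show01@wrap"])
    send = subprocess.Popen(["zfs", "send", "tank/projects/show01@wrap"],
                            stdout=subprocess.PIPE)
    subprocess.run(["zfs", "receive", "shelf01/show01"],
                   stdin=send.stdout, check=True)
    send.stdout.close()
    send.wait()
    run(["zpool", "export", "shelf01"])  # drives can now come out for the shelf

The export step is what makes the "pull those drives out" workflow possible: once a pool is exported, its disks can be removed and later re-imported on any ZFS-capable system.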
The studio has 10Gb Ethernet (10GigE) fiber connections to all its workstations. "That allows us to get real-time 2K stereo and 4K files directly from shared storage to the artist," says Liesveld. "It also gives us the luxury of working with full-resolution plates rather than introducing proxies into the workflow."

While working on the first season of the Netflix series House of Cards, Savage was able to keep the entire show, along with all related elements and reference footage, online and available to artists from start to finish. "Prior to teaming up with Open Drives, we had to do a bit of digital juggling, archiving, and restoring shots and assets to manage space," he adds. The VFX studio provided more than 300 shots for season one, including a CG library, greenscreen car shots, monitors, sky replacements, and a variety of other invisible effects. "House of Cards was shot at 5K with the Red Epic camera, so considering the volume of work we had coming in, the ability to put together affordable and scalable storage was essential," says Savage.

Now that the first season is completed, Savage has moved the critical data to nearline storage, which is essentially the same as the facility's 50TB setup without the SSD cache in front of it. "We push recently wrapped data to our nearline storage and let it sit for a while before it gets fully archived. That way it's easy to access, and if we need to get data back on the production server quickly, we can," Liesveld says.

Savage started on House of Cards' second season this past summer. Sums up Liesveld: "The last thing you want to worry about is, 'Do we have enough disk space?' because then you can't focus on the actual work."

Randi Altman is the former editor in chief of Post Magazine and a writer in the postproduction industry.

The Need for Speed

With the rapid increase in content that digital filmmaking now generates, it's extremely important for filmmakers to optimize their creative time in post. As production of digitally acquired 3D films becomes more commonplace, uncompressed digital content can translate into hundreds of terabytes per project, and the traditional tools used to manage that content are often overwhelmed by the new demands of data acquisition, collaboration, distribution, and long-term, protective archiving.

A premier postproduction facility located in New Zealand, Park Road Post Production was developed by filmmakers for filmmakers, and has worked on some of the largest-budget Hollywood films, independent American and foreign films, and lower-budget New Zealand features and short films. Pressing demands from digital postproduction required Park Road Post Production to dramatically increase its capacity and throughput. For example, on a recent project the crew processed an average of 6TB to 12TB of new material each day, and on a really busy day that figure could reach 20TB. What's more, all this new material needed to be processed and delivered to the client within 12 hours. This quick turnaround is important so filmmakers have the option of reshooting scenes before the sets are struck.

While evaluating various systems, the group at Park Road Post Production knew the solution had to be flexible and able to be tailored to fit the facility – and not the other way around. After much due diligence, the studio execs decided to extend the infrastructure using Quantum's StorNext and Scalar i6000 for virtualized tape storage. StorNext touches every part of the workflow, from ingest to archive, at Park Road Post Production. Now, source data is rapidly acquired into a StorNext environment, either on set or from field LTO-5 tapes, for collaborative processing via multiple SGO Mistika workstations. These workstations access source material concurrently over dual 8Gb/sec Fibre Channel. The source data and all metadata generated on set and derived through processing are automatically archived to LTO-5 tape via StorNext Storage Manager. Tapes are retained within the Scalar i6000 library for quick retrieval back to the shared storage pool for further processing. Tapes are also "vaulted" from the library for long-term archive. With StorNext, the studio can rely on the software to take care of the heavy lifting of moving terabytes of data, allowing the crew to focus on improving creative processes.
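Those figures imply a demanding but tractable sustained data rate. The short Python sketch below is a rough back-of-the-envelope estimate rather than anything supplied by Park Road Post Production: it compares the rate needed to turn a 20TB day around in 12 hours against the aggregate bandwidth of a dual 8Gb/sec Fibre Channel connection, and the roughly 800MB/sec of usable throughput assumed per link is an estimate after encoding and protocol overhead.

    # Back-of-the-envelope check of the turnaround numbers above.
    TB = 1e12                    # decimal terabyte, in bytes

    peak_day_tb = 20             # busiest day cited in the article
    turnaround_hours = 12        # required delivery window
    links = 2                    # dual 8Gb/sec Fibre Channel
    usable_mb_per_link = 800     # assumed effective MB/sec per 8Gb/sec link

    required_mb_s = peak_day_tb * TB / (turnaround_hours * 3600) / 1e6
    available_mb_s = links * usable_mb_per_link

    print(f"Sustained rate to clear {peak_day_tb}TB in {turnaround_hours}h: "
          f"{required_mb_s:,.0f} MB/sec")
    print(f"Aggregate dual-link bandwidth: {available_mb_s:,.0f} MB/sec")

On those assumptions, even the busiest day works out to roughly 460MB/sec sustained, leaving most of the dual-link bandwidth free for concurrent reads by the Mistika workstations.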
