Post Magazine

January / February 2019

STORAGE

During my three years at Sarofsky, I've archived more jobs over the past 12 months than I did during the previous two years combined. As we start 2019, I'm confident that all studios are facing the challenges that come with the ever-growing size of our data, many of which stem from advances in camera technology. With resolutions climbing higher and higher, and many projects needing to be revised year after year, studios like ours are constantly scrambling to keep solid plans in place that cover a job from ingest all the way through to archival. Because this is a wide-ranging problem that spans all forms of visual artistry, from entertainment to advertising, experiential and beyond, I thought it would be helpful to share our approach to managing this ubiquitous challenge, to the benefit of our clients and the peace of mind of our producers and artists.

PREPRODUCTION

In preproduction meetings, we are bringing IT in earlier than ever to plan how jobs will move through our pipeline. Our objectives are to determine potential file sizes, upload/download requirements and security needs, and to create a custom data plan for every job. Yes, every project gets its own discrete treatment, and obviously, some assignments require extra care and handling. On the backend, my team starts its planning immediately. To facilitate this, we keep up-to-date records of which jobs are ready for archival, and make sure we have enough space on our primary and secondary storage solutions, which are described in more detail below.

PRODUCTION AND POST

At Sarofsky, we pride ourselves on having multiple high-speed locations for ingesting out-of-house hard drives, all of which arrive at USB-C/USB 3 speeds. We use a new iMac Pro for USB-C ingest, along with three Mac Pro workstations, all connected via Fibre Channel.
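The per-job data plan and storage-capacity check described in the preproduction section above could be sketched roughly as follows. This is a hypothetical illustration: the field names, sizes, and the `primary_has_room` helper are assumptions for the example, not Sarofsky's actual records or tooling.

```python
from dataclasses import dataclass

# Hypothetical per-job data plan; every field here is an illustrative
# assumption, not an actual production record.
@dataclass
class DataPlan:
    job_name: str
    resolution: str              # e.g. "4K", "8K"
    est_ingest_tb: float         # expected footage size at ingest, in TB
    needs_secure_transfer: bool  # extra care and handling for sensitive jobs
    deliverable_count: int       # social versions, interactive elements, etc.

def primary_has_room(plan: DataPlan, free_tb: float, headroom: float = 1.5) -> bool:
    """Check that primary storage can hold the job plus working headroom."""
    return free_tb >= plan.est_ingest_tb * headroom

plan = DataPlan("spring-campaign", "4K", est_ingest_tb=6.0,
                needs_secure_transfer=True, deliverable_count=50)
print(primary_has_room(plan, free_tb=12.0))  # True: 12 >= 6 * 1.5
```

The headroom multiplier reflects the article's point that planning happens before ingest: a job needs room not just for raw footage but for renders and deliverables generated during post.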
On our live-action shoots, we use an OWC ThunderBlade as a portable drive to get the fastest transfer speeds, then move the data quickly via the iMac Pro.

During post, producers are also running into another issue: final deliverables are not only larger in size, their numbers and applications have multiplied exponentially, with up to 50 different social versions and interactive elements typically appearing on the slate these days. Submitting those has now become its own mission-critical workflow. To dial all of this in, we have two dedicated transfer stations on an isolated, faster one-gigabit Internet path, used for both uploading and downloading deliverables and content. This dedicated increased speed is crucial during production, especially with the standard "extremely tight" deadlines.

ARCHIVAL

After a job has completed its course and hits the airwaves, it's time for IT to switch the project over to the archival phase. Monthly, IT coordinates with producers to decide which jobs can be placed on the archival list. Once approved, the job is removed from primary storage, but it still lives on our secondary (backup) storage, where it remains for another six months or longer, depending on the specifics of the job.

After looking at trends in our unarchive requests, we have determined that the six-month holding period is a sweet spot, as most advertising work comes back for revisions within that window. Having the project live on-premises for those six months greatly shortens the time it takes to restore it versus pulling it from the cloud, a bottleneck that came to light through testing. Entertainment work, on the other hand, gets a one-year hold, as we have noticed that the usual lifecycle is for a show to come back for revisions before its new season debuts. Our backup also synchronizes with our cloud storage, ensuring that we have three copies to fall back on if a project ever gets corrupted or lost.
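The hold-period rule above, six months on secondary storage for advertising work and a year for entertainment, could be expressed as a small policy check. This is a sketch under stated assumptions: the category names, day counts, and function names are invented for illustration.

```python
from datetime import date, timedelta

# Illustrative hold periods on secondary (on-premises) storage before a job
# lives only in the cloud. The exact day counts are assumptions.
HOLD_DAYS = {"advertising": 182, "entertainment": 365}

def cloud_only_after(completed: date, category: str) -> date:
    """Date after which the job is served from cloud archive only."""
    return completed + timedelta(days=HOLD_DAYS[category])

def on_secondary(completed: date, category: str, today: date) -> bool:
    """True while the job is still restorable from on-premises backup."""
    return today < cloud_only_after(completed, category)

# An ad job finished June 1, 2018 is still on-prem in November...
print(on_secondary(date(2018, 6, 1), "advertising", date(2018, 11, 1)))  # True
# ...but has moved to cloud-only by the following January.
print(on_secondary(date(2018, 6, 1), "advertising", date(2019, 1, 1)))   # False
```

The point of encoding the rule this way is the one the article makes: on-premises restores are much faster than cloud restores, so the policy window should match when revision requests actually arrive.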
After the six-month period, the job lives in the cloud in an archival state until it's needed. As a company, we are still working out contractually how long to keep archives in the cloud, as this is a new frontier for us. Early discussions have us keeping projects in the cloud for five years, then determining on a case-by-case basis whether a project needs to be kept around longer.

WINNING TOMORROW'S BATTLE FOR SUPREME STORAGE SOLUTIONS

BY MARK GALAZKA, IT MANAGER, SAROFSKY, CHICAGO, IL — WWW.SAROFSKY.COM

ONE APPROACH TO MANAGING THE STORAGE CHALLENGE

Sarofsky is addressing many of the same storage challenges other studios face.
