Post Magazine

June 2015

EMERGING TECHNOLOGIES

During the recent 2015 NAB Show in Las Vegas, The Foundry created some buzz by presenting its research in virtual reality (VR) with technology demonstrations at its booth. According to company co-founder and chief scientist Simon Robinson, The Foundry was sharing some "early tech previews of the company's upcoming set of tools that will do live-action VR work inside of Nuke."

Post recently had an opportunity to speak in more detail with Robinson about why he thinks VR is an important emerging technology for the VFX industry, and about the company's involvement with Project Dreamspace, a three-year European Commission-funded project tasked with researching and developing tools for creative professionals in visual effects.

Can you tell us about the work The Foundry is doing with VR?

"We had some contact with a few companies about a year and a half ago who were interested in producing live-action VR experiences. Some people call it omni-directional stereo, or immersive video, but loosely speaking, it's any experience that's based around capturing something using lots of live-action cameras. Some of the people we talked to early on were typically building their own customized rigs with GoPro cameras and the like, and just going out and shooting stuff."

Where is the technology right now?

"Users are relying on the increasing affordability of cameras and creating these rigs. If you see them, they are like big balls of cameras mounted on tripods, and they capture everything, in all directions, at the same time. You can then produce an experience where you can sit in with a headset or a stereo headset and actually see stuff in 3D. It's quite compelling. But it's also bringing up a whole load of production and on-set challenges. And, for all of them, there's the increasing question of: 'How do I do post production on this?' As you can imagine, that side of it is quite intense.
"It's hard to know how to do editorial, sound and, in particular, how to assemble all of the data from dozens of cameras in one spot at one time and do the kind of post you would expect to do on a shot.

"We found that there were more and more of these companies doing this kind of stuff, and typically they were all using Nuke as a core part of how they got the work done. And I think Nuke is a natural place for it. It's a great environment for taking a load of multi-camera data and bringing it together. We went to dozens of these kinds of companies who were doing this type of work, picked out the common set of problems they were facing, and put together this prototype set of tools. Then, we brought the prototype to NAB and demonstrated it.

"We had some great footage a customer gave us, I think from a 12-camera setup, and then we went through in Nuke how to import the data, how to calibrate the cameras and make sense of it, and how to do some post work on it to produce a final output, which we then demo'd running on an Oculus Rift [virtual reality headset].

"We found that there were at least two or three times as many people doing it as we thought. We came back from there with a neat idea of how much work we still have to do."

You want to bring VR tools to Nuke?

"Yes, because people were running up against the same brick wall, over and over again. We have knowledge of how to solve those problems, but that knowledge isn't available in Nuke yet.

"If we go back a few years, to when the whole stereo 3D movement was kicking off, around the time of Avatar, we put together a bunch of technology solutions to deal with multi-camera challenges. Then, it was specifically for two cameras, which was hard enough to do at the time. But now, a lot of what we're putting in place to help solve these newer challenges is almost like a sequel to that technology from a few years back.
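For background on the assembly step Robinson describes: footage from a spherical rig is typically stitched into a single equirectangular (lat-long) frame, the standard layout for omni-directional video. A minimal sketch of the direction-to-pixel mapping that underlies such stitching is below. This is a generic illustration with a hypothetical function name, not The Foundry's tools or Nuke's API.

```python
import math

def direction_to_equirect(dx, dy, dz, width, height):
    """Map a 3D view direction to (x, y) pixel coordinates in an
    equirectangular (lat-long) frame. Illustrative sketch only:
    +z is 'forward', +y is 'up', and the frame spans a full sphere."""
    # Longitude (yaw) in [-pi, pi], latitude (pitch) in [-pi/2, pi/2].
    lon = math.atan2(dx, dz)
    lat = math.asin(dy / math.sqrt(dx * dx + dy * dy + dz * dz))
    # Normalise to [0, 1], then scale to pixel coordinates.
    u = (lon / (2 * math.pi)) + 0.5
    v = 0.5 - (lat / math.pi)
    return u * (width - 1), v * (height - 1)
```

A stitcher evaluates a mapping like this for every output pixel, then works out which of the rig's cameras see that direction and blends their contributions; calibration is what establishes each camera's orientation so those lookups line up.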
"People in post are so talented that I always suspect, given enough time, they can make anything work themselves. But clearly, tools designed by us to help solve that particular problem are good to have."

Is this related to Project Dreamspace?

"Yes. Project Dreamspace is mostly about what happens on-set with virtual environments, where you want to mix virtual environments with real actors and see what's going on. Our involvement is very much around compositing. The boundaries of where tools get used are really blurry these days, so even a tool like Nuke, which is essentially a compositing system, is being used a lot in previs, on-set and in post. The interesting thing for us is, how can we improve Nuke to make it a better candidate for on-set work, so that if you're a director and you've just shot something on greenscreen, you can see how what you just shot might look after it's composited? The applicability of this is huge, and no one knows where it's going. The Foundry is putting the effort in and supporting emerging disciplines like this. We think it's important and we will get results."

CREATING AN EASIER PATH TO VIRTUAL REALITY
THE FOUNDRY DISCUSSES ITS RESEARCH IN 'VR' AND BRINGING NEW TOOLS TO NUKE
BY LINDA ROMANELLO

The Foundry is working to add VR tools to its Nuke software. The company previewed the technology at NAB.
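The on-set preview Robinson describes rests on keying the greenscreen plate so a virtual background can show through. As a rough illustration of the core idea only — real keyers, including Nuke's, are far more sophisticated, and this hypothetical function is not The Foundry's method — a minimal per-pixel green key might look like:

```python
def green_key_alpha(r, g, b, threshold=0.0, softness=0.2):
    """Alpha for a simple green-screen key, based on how much the green
    channel exceeds the larger of red and blue. Channels are floats in
    [0, 1]; returns alpha in [0, 1], where 0 is fully keyed out
    (background shows through) and 1 is fully foreground."""
    spill = g - max(r, b)          # how "green" the pixel is
    if spill <= threshold:
        return 1.0                 # not green: keep the foreground
    if spill >= threshold + softness:
        return 0.0                 # strongly green: fully keyed out
    # Soft falloff between the two thresholds for smooth edges.
    return 1.0 - (spill - threshold) / softness
```

An on-set preview runs something like this over every pixel of the live plate in real time, then composites the result over the virtual environment so the director sees an approximation of the final shot.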
