Computer Graphics World

July / August 2016

Issue link: https://digital.copcomm.com/i/711516

Contents of this Issue

Navigation

Page 36 of 75

j u ly . a u g u s t 2 0 1 6 c g w 3 5 Yet, Everest VR is more than a simulated physical experience, as educational infor- mation about the mountain, climbers, and sherpas is presented along the route. M A G I C M O U N T A I N On the virtual mountain, climbers will experi- ence a range of snowfall and wind, as well as changing skies, all created with simulations. For the snow effects, the artists used Nvid- ia's new Turbulence scalable fluid simula- tion, part of Nvidia's GameWorks, which enables artists to change parameters such as fluid viscosity, turbulence, and particle mass to generate particle effects based on fluid dynamics. "We are using millions of particles in a 3D fluid simulation in real time," says Hardar- son. The particles and turbulence fields, generated within the game engine, collide with the environment and form swirls and blow from side to side – bouncing off the hikers' gloves, for instance. To create snow in the distance, the artists used Side Effects' Houdini and Maya. "There are over 50 different words for 'snow' in Icelandic," says Hardarson, noting there is snow everywhere in Everest VR – on the ground, in the air. "There are many types in the experience, and we used multiple methods to produce the effects, from simple shader trickery on sprites and planes, video cards created in Houdini, and real- time fluid simulations using GameWorks." As all artists know, getting detail from the white snow is not an easy task. With such a stark, white environment throughout this application, lighting became especially difficult. "Snow is super hard; it's a massive challenge from a visual standpoint. It's not just the subsurface scattering, but the way it reflects light into all directions from within the subsurface that makes it quite tricky," says Hardarson. "It is very easy to get wrong." The RVX crew already overcame this obstacle for the movie, and now the artists had to solve it again within the real-time environment through shaders and lighting. "Our familiarity was a big help, and we were able to iterate with confidence that we were going in the right direction until we had a representation of what we were aer." Of course, that familiarity resulted from the film as well as real-world reference outside their office windows in Iceland. In addition to the snow, the artists used real-time cloth simulation, with wind beating against the climbers' suits. The cloth sim was generated using Nvidia's PhysX, part of its GameWorks toolset. To optimize the VR development, the group also used Nvidia's VRWorks, a suite of APIs, sample code, and libraries for virtual-reality development. Additionally, the crew used Simul So- ware's TrueSky real-time cloud simulation, and provided volumetric clouds and atmospherics. The development work at Solfar, which is still in progress, is being done on high-end PCs with Nvidia 980TI cards. The studios' platform machines contain i5 processors with Nvidia 970GX cards. "You need lots of power to run this at 90 frames per second," says Hardarson. "But, it's hard to maintain that 90 frames throughout." D R I V E N T O N E W H E I G H T S The VR experience comprises 108 billion pixels and 13 million polygons – the equiva- lent of 14,000 shots taken with an iPhone 6 camera. Creating such a realistic, real-time experience resulted in a number of produc- tion challenges. According to Hardarson, one of the biggest technical challenges was cramming the experience onto an Nvidia 970GX card with 4 gb of video memory. "A big part of the last two to three months has been getting this to run on consumer- level machines at 90 fps all the time," Hardar- son says, noting they are reducing the memo- ry footprint through the use of Graphine's Granite texture-streaming middleware. "VR is so new, and there are so few people pushing the extreme limits like we are, that there's not a lot of people who we can ask for advice. It's great to have companies like Nvidia and Epic to turn to, to help us squeeze the most graphic quality from the imagery," he continues. "Aer all, we are all in this together, to create the most compelling experiences for VR." In all, the group re-created roughly a seven-square-kilometer area for the high-resolution "experience," and approxi- mately 50 kilometers of the low-resolution area in the distance. As of press time, the team was in the final stages of production, with a target completion date of this summer. Everest VR will be released initially for the HTC Vive and will be available through Valve's Steam. In the fall, it will be available in various app stores and will be supported on the Oculus Rift. While Everest VR has been an educa- tion experience for both studios, each plans to apply the lessons learned on the virtual mountain to another high-end VR project. "We plan to take a similar voyage and route again, which is what happens when you push the envelope," says Hardarson. "It's good to push bound- aries, especially in this new medium. If it were easy, we'd probably be doing something else." ■ USERS WILL ENCOUNTER VARIED WEATHER CONDITIONS AND CHANGES IN TIME OF DAY. Karen Moltenbrey is the chief editor for Computer Graphics World.

Articles in this issue

Archives of this issue

view archives of Computer Graphics World - July / August 2016