Computer Graphics World

JULY 2010


Visual Effects

hand, could produce the necessary 3D fire simulations, but that system runs on CPUs and couldn't provide the quick turnaround that Hammack and Maury wanted. "Chris [Horvath] inspired us to use the GPU for simulation and rendering," Maury says. "He taught us that it could be a reliable tool."

As did Horvath, Maury used GPUs from Nvidia; for Airbender, the Quadro FX 5800 cards with 4GB of memory. By this time, CUDA 2.1 was available, and using that framework, Maury developed a 3D fluid simulator and volume renderer now known as Plume (see Viewpoint, pg. 10).

"Plume is a fairly traditional smoke solver with an artist-friendly combustion model in that it allows artists to quickly come up with fire and flame looks by controlling the rendering," Maury explains. "Artists can iterate and show results as a final render, rather than show an intermediate visualization."

And that, according to Hammack, was key. "We rarely saw anything but final renders out of the system, so we didn't have to judge a geometric representation of the fire before we got to a rendering stage," he says. "We realized that for simulations like fire, it's about the iteration, about the speed with which you can turn around the takes."

To overcome the GPU's memory limitations, they developed a system to chain grids together. "Because the fire travels in one direction, we can put the grid domains next to each other and have one feed into the next," Hammack says, "so we were able to maintain resolution over a large space that way. We built the system from the beginning to optimize memory."

Dan Pearson, digital production supervisor, describes Plume's rendering engine. "It's a full-on ray-marching renderer with fire and smoke built in. It casts rays to do internal shadows. You associate a density ramp and color ramp with the values inside the simulation, and as you run down each camera ray, you accumulate density and color, and that gives you the final color." For smoke, the system calculated self-shadows as it ran, using the light source from the fire and external sources.

"It's a great tool," Pearson says. "It ran 50 to 100 times faster than our software equivalent at the time. We could produce a rendered simulation in a couple hours rather than a couple days."

And, not only for fire. As Plume developed and the crew became facile with the tool, the artists began using it to bend air and dust, as well. "With this technique, we began to use it for applications we had never considered using a full simulation to do before, even little dust hits," Pearson says. "We can do something physically correct without waiting until tomorrow morning. We get a whirling motion, and things react from collisions nicely, and it generally looks more believable. At the least, it's a great place to start."

At right, Aang (Noah Ringer), a young avatar and Airbender, learns how to manipulate water from a water-bending friend. Above, effects artists at ILM used sculpted geometry to create source and target objects for the studio's Physbam fluid-simulation engine.

Earth, Wind, and Fire

Aang bends earth in only two instances in the film, and for those, the studio used a combination of the fracturing technology developed for Indiana Jones and the Kingdom of the Crystal Skull (see "Keys to the Kingdom," June 2008) and Plume. "We could fracture the earth geometry, and from that, drive particles that we fed into Plume to generate the dust," Hammack says.

For air bending, the crew used Plume to generate fluid dynamics, and then an up-res'ing technique to create the look. "We saved out the grid data from the simulation, which had nice air currents and swirls, and fed a couple billion particles through it," Pearson says. "Then, we ray-marched through the particles. Each particle added just a little density to the grid. It gave us a lovely, wispy look."

Air bending, it turned out, was more difficult conceptually than technically: The team considered contrail vapors before deciding to pull the look from the surrounding environment. Colored lights that matched the environment tinted the neutral white base color of the particles and hinted at the surrounding world.

"The most fun shots, though, were when we played with multiple elements together," Hammack says. "We could have a gust of air disrupt the base fluid simulation that the fire is driven through. You get a sense that one affects the other because they do."

For example, at one point in the film, Aang deflects fire shot at him by bending the earth; that is, generating a wall of dust that causes the fire to swirl around it, carrying dust particles with it. The bending happens during the four-and-a-half-minute shot. "The fire system lets you introduce geometry as obstacles, so we fed the walls of dirt directly into Plume, and the fluid dynamics wrapped the fire around it," Hammack says. "Then, since we were doing dust in Plume anyway, we could feed the dust into the fire flow."

Before, big effects shots such as that, which required multiple simulations and long render times, took someone with deep technical knowledge to run. On this film, effects artists ran the shots. "This movie was great for me," Hammack says. "We finally have a tool that is interactive enough to break the tradition of having the most technical person in the company run the big shots. I have to say that fire bending was a lot more fun than water."

Water, Water, Water

The water simulation tasks ranged from a huge wave, to tentacles of water twice the size of a person, to a ball of water that rises from
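Pearson's description of Plume's renderer — a density ramp and a color ramp applied to the simulation values, accumulated as you run down each camera ray — is classic front-to-back volume ray marching. The sketch below shows that accumulation loop in miniature; the ramp shapes, step size, and function names are illustrative stand-ins, not Plume's actual code or API.

```python
# Toy front-to-back ray march: map each sample through a density ramp and a
# color ramp, then accumulate color weighted by the light that survives the
# densities already crossed. Ramps and step size are invented for illustration.
import math

def density_ramp(v):          # simulation value -> optical density
    return max(0.0, v - 0.1) * 4.0

def color_ramp(v):            # simulation value -> RGB (hotter values whiter)
    t = min(1.0, v)
    return (1.0, 0.4 + 0.6 * t, 0.1 + 0.9 * t * t)

def march_ray(samples, step=0.1):
    """Accumulate color and opacity along one camera ray, front to back."""
    rgb = [0.0, 0.0, 0.0]
    transmittance = 1.0                       # light not yet absorbed
    for v in samples:                         # one simulation value per step
        alpha = 1.0 - math.exp(-density_ramp(v) * step)  # slab opacity
        c = color_ramp(v)
        for i in range(3):
            rgb[i] += transmittance * alpha * c[i]
        transmittance *= 1.0 - alpha
        if transmittance < 1e-3:              # early exit: ray is saturated
            break
    return tuple(rgb), 1.0 - transmittance    # final color and opacity

color, opacity = march_ray([0.0, 0.3, 0.8, 1.2, 0.5])
```

The internal shadows Pearson mentions would come from marching secondary rays toward each light source with the same accumulation, which is omitted here for brevity.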
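Hammack's grid-chaining trick — fixed-size domains laid end to end along the fire's direction of travel, each feeding the next — can be illustrated with a 1D toy. A one-cell-per-step advection stands in for the real 3D solver; domain size, chain length, and every function name here are assumptions for the sketch, not ILM's implementation.

```python
# Chained domains: instead of one huge grid, several fixed-size grids are
# lined up, and each step the outflow face of one domain becomes the inflow
# of the next. All sizes and names are illustrative.
N = 16                                    # cells per domain (memory budget)
CHAIN = 3                                 # number of chained domains

def step_domain(cells, inflow):
    """Advect one cell to the right per step; return new cells and outflow."""
    outflow = cells[-1]
    return [inflow] + cells[:-1], outflow

def step_chain(domains, source):
    feed = source                         # injected at the chain's start
    new = []
    for cells in domains:
        cells, feed = step_domain(cells, feed)   # outflow feeds next domain
        new.append(cells)
    return new

domains = [[0.0] * N for _ in range(CHAIN)]
for _ in range(N * CHAIN):                # run long enough to fill the chain
    domains = step_chain(domains, 1.0)
# density has now crossed all three domains, with no single big grid ever
# resident in memory at full resolution
```

The design choice matches the quote: because the flow is one-directional, only a thin boundary layer has to pass between domains, so resolution is maintained over a large space without one monolithic allocation.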
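The up-res'ing pass Pearson outlines — save the grid data, feed particles through its currents, then let each particle add a little density back for ray marching — can be sketched as advect-then-splat. The velocity field, particle count, and per-particle weight below are made up for illustration (a 2D swirl stands in for the saved 3D simulation data).

```python
# Up-res sketch: advect particles through velocity data "saved" from a base
# simulation, then splat each particle's small density contribution onto a
# grid. Field, counts, and weights are invented stand-ins.
import random

GRID = 8                                          # cells per axis (2D toy)

def saved_velocity(x, y):
    """Stand-in for the saved simulation grid: a gentle swirl about center."""
    cx, cy = GRID / 2, GRID / 2
    return (-(y - cy) * 0.1, (x - cx) * 0.1)

def advect(particles, dt=1.0):
    out = []
    for x, y in particles:
        vx, vy = saved_velocity(x, y)
        out.append((x + vx * dt, y + vy * dt))
    return out

def splat(particles, weight=0.01):
    """Each particle adds just a little density to its cell."""
    grid = [[0.0] * GRID for _ in range(GRID)]
    for x, y in particles:
        i, j = int(x) % GRID, int(y) % GRID       # wrap for the toy example
        grid[j][i] += weight
    return grid

random.seed(1)
particles = [(random.uniform(0, GRID), random.uniform(0, GRID))
             for _ in range(10_000)]              # stand-in for billions
for _ in range(5):
    particles = advect(particles)
density = splat(particles)                        # grid ready for ray marching
```

At production scale the same structure holds, but the particle count is in the billions and the splatted grid is what the ray marcher described earlier consumes, giving the "lovely, wispy look" from coarse base dynamics.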
