Computer Graphics World

April/May 2012

Simulation

One of those simulation experts, CG Supervisor Willi Geiger, joined the Battleship crew 18 months before the movie's release date. Geiger had set up the pipeline for Poseidon, had worked on the second Pirates film, consulted on Pirates of the Caribbean: At World's End, and helped create a wave of water that flooded Washington DC in Evan Almighty.

"I looked at the bidding sheets for Battleship and compared that work to the scope of the work we had done before and how long it had taken," Geiger says. "That's when I started panicking. It was double anything we'd done before, and we had half the time. I realized that we'd have to rebuild our tool set to have any hope. It was the longest and most challenging project I've ever been involved with."

To confront the problem, ILM formed a "Battleship Water Department." Among the team members were Smythe, who oversaw all the areas of CG technology; Geiger, the CG supervisor; Nick Rasmussen, who leads the effects simulation group in ILM's R&D department; R&D engineer Rick Hankins, who stayed with the show from system development all the way through final shots; and R&D developers Andrew Johnson and Cliff Ramshaw.

"When Willi came to me, Nick Rasmussen, and R&D in general, we looked at what they wanted to do, the time frame we had for each shot, and the scope of the work," Hankins says. "We're production-driven, so the work the developers get is what's most important at the time. But, some of us have fluid and water specialties, so we were pretty excited about getting to work on new water simulations. It wasn't as scary to us as to someone like Willi and Doug, who have to deliver shots."

Geiger gave the R&D team a mandate: make the water simulation more accurate, detailed, precise, reliable, and five times faster. That is, the time from the artist starting a shot to delivering a simulation needed to be five times faster. To achieve that goal, the team spent between six and eight months developing new "sub-tools" and new methods.

"We realized that we could hold our largest sims in 32GB of memory or so on a single machine," Smythe says. "That wasn't the limiting factor. The biggest limiting factor was time. It was how long we wanted to wait, not how much memory a machine had. So we decided to make the simulations faster. And then, they could be as complicated as we wanted."

Although ILM's engineers had parallelized the fluid system for Poseidon, it wasn't multithreaded. "We based our previous system on MPI [the Message Passing Interface specification] for communication between different processes on different or multiple machines, but relied on the infrastructure built on top of our rendering scheduler to grab machines and spread the work out," Hankins explains.

That meant the artists couldn't predict how long a simulation would take; the result depended on available resources. "If you couldn't get all fast machines, the weakest machine in the group would slow the whole simulation down," Hankins says. "With a threaded model, we could run the simulation on one [multi-core] machine and predict the simulation completion time. But we had to learn how to do threading properly. We went through every single algorithm—and a fluid system can have many components—to find the best way to decompose the problem. MPI was sometimes the best solution, sometimes not."

When they finished, the difference was dramatic. "With the old-style system, we had to grab exclusive use of as many as 16 machines and lock everyone out of them, and coordinate those 16 machines for however long the simulation needed to run, which could be days, and it just became impossible," Geiger says. "Now, a simulation that might have tied up eight machines for two days can run on one eight-core machine and finish in a few hours."

Moreover, because the simulation ran on one machine, it was more reliable. So, check two items off Geiger's list: speed and reliability. "Multi-threading was without a doubt one of the biggest things that allowed us to have any hope of finishing this movie," Geiger says.

[Image caption: At top, Industrial Light & Magic artists simulated water using new tools, fire using the studio's Plume system developed earlier, and, at bottom, destruction with existing rigid-body dynamics systems to create action sequences such as these for Battleship.]

Details, Precision, Accuracy

And that was only the start. "We measured where all the time was going," Hankins says. "We knew we would have to bring in junior artists, not just high-level TDs, so the simulation needed to be less technical from the user's standpoint. It needed to behave correctly."

Until this film, to create the ocean waves, splashes, spray, mist, and foam, the simulation artists would run a base water simulation, which provided the undulating surface, and then separate particle simulations for splashes, spray, mist, and surface foam, which produced the scale, complexity, and volume. "We'd have all these individual simulations, which was a headache for the artists and for the compositors down the road who had to piece all the parts together," Geiger says. "And, of course, it wasn't physical. Water doesn't behave in differ-
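The layered approach Geiger describes, with a base surface simulation plus independent passes for splash, spray, mist, and foam that someone downstream must gather per frame, can be sketched as a toy. This is not ILM's pipeline; the function names (`run_pass`, `assemble_frame`) and cache-file naming are hypothetical, purely for illustration of why per-element passes create a coordination burden.

```python
# Toy illustration (not ILM's code) of the pre-Battleship layering:
# every element comes from its own simulation pass, and the compositor
# has to collect and piece together all the passes for each frame.
def run_pass(name, frame):
    # stand-in for a real simulation; returns a labeled per-frame result
    return {"element": name, "frame": frame, "data": f"{name}_{frame:04d}.cache"}

def assemble_frame(frame, elements=("base_water", "splash", "spray", "mist", "foam")):
    """Gather every independently simulated element for one frame.
    If any single pass is missing or out of sync, the composite breaks,
    which is the headache the article describes."""
    passes = [run_pass(e, frame) for e in elements]
    return {p["element"]: p["data"] for p in passes}

print(assemble_frame(12)["foam"])  # foam_0012.cache
```

Because the passes never exchange information, nothing forces them to agree physically, which is the other complaint in Geiger's quote.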
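Earlier in the piece, Hankins describes replacing MPI-distributed runs with a threaded model on one multi-core machine, decomposing each algorithm across cores. A minimal sketch of that decomposition pattern, assuming a simple Jacobi-style smoothing pass as a stand-in for a real fluid solver step, might look like the following. This illustrates only the structure (disjoint row slabs per thread, double-buffered so readers never race writers); it says nothing about ILM's actual solver, and pure-Python loops like these would need native code to see real multi-core speedups.

```python
# Minimal sketch (not ILM's system) of one solver step decomposed across
# threads on a single machine, rather than across MPI ranks on a farm.
from concurrent.futures import ThreadPoolExecutor

def smooth_step(grid, n_threads=4):
    """One Jacobi-style smoothing pass over a 2D grid (list of lists)."""
    rows, cols = len(grid), len(grid[0])
    out = [row[:] for row in grid]  # double buffer: threads read grid, write out

    def do_slab(r0, r1):
        # each thread owns a disjoint slab of interior rows, so no locks needed
        for r in range(max(r0, 1), min(r1, rows - 1)):
            for c in range(1, cols - 1):
                out[r][c] = 0.25 * (grid[r-1][c] + grid[r+1][c]
                                    + grid[r][c-1] + grid[r][c+1])

    slab = (rows + n_threads - 1) // n_threads
    with ThreadPoolExecutor(max_workers=n_threads) as pool:
        futures = [pool.submit(do_slab, i * slab, (i + 1) * slab)
                   for i in range(n_threads)]
        for f in futures:
            f.result()  # propagate any worker exception
    return out

# usage: a single hot cell diffuses a quarter of its value to each neighbor
g = [[0.0] * 8 for _ in range(8)]
g[4][4] = 1.0
g = smooth_step(g)
print(g[4][3])  # 0.25
```

On one machine the work per step is fixed, which is why completion time becomes predictable, the property Hankins contrasts with farm runs throttled by their slowest node.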
