Computer Graphics World

July-Aug-Sept 2021

The team turned to primates to study how their features change as they age, "keeping in mind that Kong is an anthropomorphized ape and has a lot of human characteristics," says Hirota. They also studied the physicality of older bodybuilders and weight lifters.

Two setups were used to handle Kong's muscle simulations. For shots that required fine muscle and tissue detail, they modeled the skeleton, muscles, and tissue with thickness for the fascia and skin, and the group then used Ziva Dynamics' FEM solver, a physics-based muscle simulator, which gave a believable physicality to the creatures. For efficiency on shots that didn't require such hero simulations, the artists developed a body muscle system in Autodesk's Maya with an anatomical procedural jiggle rig that ran in near real time for rapid iteration. The artists could mix the results from Ziva with the real-time Maya jiggle rig on a shot-by-shot basis. Scanline further developed an automated simulation process for the muscles, jiggle, and hair, which could run over a series of shots to increase efficiency even more.

On the aesthetic front, getting the right look for an aged Kong required some back and forth between Scanline and the director and studio. "We needed to get the right amount of beard coverage and aging in the groom," Hirota explains. Kong has a number of different groom states throughout the film (dry, wet, oily, and burnt), which had to be tracked for continuity. Artists shaded his hair using Chaos' VRayHairNextMtl material; the groom was done with Maya's XGen, and the simulation in Maya's nHair. "We spent a good deal of effort on introducing the aging and placement of both gray and white hairs in his groom," says Hirota. "Once we were happy with the base look of the fur, we developed the other variants that are seen in the film, including the various stages of wetness and a dusty and oily version from his final confrontation with Mechagodzilla."

Scanline, in fact, completely overhauled its hair system to allow for interactive manipulation of the guide hairs, and created a multi-shot hair simulation tool. All the fur elements were created by sculpting guide curves. "At the outset, we would start with a smaller amount of guide curves and try to push the groom to about 70 percent completion. We were able to change the parameters of all the grooming attributes on the fly without having to redo everything, as is the case with a purely sculpting-based grooming workflow," Hirota says. "Although we were using a photorealistic approach, we were still able to utilize some of the purely sculpting-based grooming tool features to add finer details when required."

As Hirota points out, Kong's groom was so complex that the team was limited in terms of efficiency and iterative abilities. To solve that, they split the ape's groom into 10 smaller sections, resulting in faster hair generation and preview times within the viewport. They also harnessed the viewport render features in Maya's Viewport 2.0, which provides full shading for fur as well as lighting and shadow previews. "Being able to view something that closely resembled how the rendered fur would look without having to go through render tests meant we were able to iterate much faster," he says.

In all, Kong has more than six million hairs (6,358,381 to be exact), which were simulated in every shot the great ape is in.
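The article doesn't say how the Ziva FEM result and the real-time jiggle rig were actually combined per shot, so the following is only a minimal sketch of one way such a mix could be wired up in Maya using a blendShape between cached deformation sources. The mesh names, shot codes, weights, and the mix_muscle_results helper are hypothetical; this is not Scanline's pipeline code.

```python
# Sketch: per-shot mix of a hero FEM muscle cache and a lightweight
# jiggle-rig cache onto the render mesh, via a Maya blendShape.
import maya.cmds as cmds

# Per-shot weight toward the hero FEM simulation; the remainder comes
# from the near-real-time jiggle rig. Values are illustrative only.
SHOT_FEM_WEIGHT = {
    "sh0100": 1.0,   # hero close-up: full FEM detail
    "sh0240": 0.35,  # mid shot: mostly jiggle rig, a touch of FEM
    "sh0310": 0.0,   # wide shot: jiggle rig only
}

def mix_muscle_results(render_mesh, fem_mesh, jiggle_mesh, shot):
    """Blend two baked deformation sources onto the render mesh.

    render_mesh: the neutral mesh that gets deformed for rendering.
    fem_mesh:    geometry carrying the FEM muscle solve.
    jiggle_mesh: geometry carrying the procedural jiggle-rig result.
    """
    w = SHOT_FEM_WEIGHT.get(shot, 0.0)
    # Targets are listed first, the base (render) mesh last.
    bs = cmds.blendShape(fem_mesh, jiggle_mesh, render_mesh,
                         name="muscleMix_%s" % shot)[0]
    # Target 0 = FEM cache, target 1 = jiggle cache; weights sum to 1.
    cmds.blendShape(bs, edit=True, weight=[(0, w), (1, 1.0 - w)])
    return bs

# Example: favor the FEM solve for the hero close-up.
# mix_muscle_results("kongBody_render", "kongBody_fem", "kongBody_jiggle", "sh0100")
```

The same per-shot weighting idea would let artists dial a single value rather than re-simulating, which is the practical point of keeping both a hero and a lightweight setup.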
One of the more difficult aspects of Kong, according to Hirota, stemmed from his human-like qualities, particularly when he had to express human-like emotions. "As we knew Kong's performance was going to be crucial to the story and that he was going to have to convey a wide range of emotions, we dedicated a focused effort into rebuilding our eye model for Kong's eyes," he says.

The group worked on accurately replicating the shape of the cornea so it would refract light and interact properly with the iris, and added a thin membrane for the eye veins so they weren't simply painted onto the sclera. They further added a tear film and had full control of the meniscus, enabling them to control the mix of oil and water that sits on the surface of the eye. "All these additions meant we were able to get proper colorization and increased realism into our eye model," Hirota says.

Kong exhibits a wide range of emotional states throughout the film, from tender moments with the human Jia to epic moments of rage. "Especially for moments when Kong needs to express specific emotions, we would motion-capture a full performance for Kong (both body and face)," says Hirota. This was done by implementing a new facial capture system using software from Faceware Technologies; throughout the process, the team harnessed feedback from machine learning to help improve and refine the targeting of the human performance that drives Kong's face. Additionally, the group referenced a study on primate FACS from the University of Portsmouth to discern the differences between human and chimpanzee faces.

[Image caption: Kong has various groom states in the film, from wet to dry and more.]
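To make the cornea and tear-film work a little more concrete, here is a small, generic sketch of the underlying optics: Snell's-law refraction at the eye's surface and a simple blend between water and oil indices of refraction standing in for the meniscus control. The IOR constants, the oil_fraction parameter, and both helper functions are illustrative assumptions, not Scanline's shader code.

```python
# Sketch: refraction at the tear film and cornea using Snell's law,
# with the tear film's IOR driven by an artist-style oil/water mix.
import math

IOR_AIR = 1.000
IOR_WATER = 1.333   # aqueous layer of the tear film
IOR_OIL = 1.470     # lipid layer (approximate)
IOR_CORNEA = 1.376  # commonly cited value for the human cornea

def tear_film_ior(oil_fraction):
    """Blend water and oil IORs; oil_fraction plays the role of a meniscus control."""
    return (1.0 - oil_fraction) * IOR_WATER + oil_fraction * IOR_OIL

def refract_angle(theta_in_deg, ior_in, ior_out):
    """Return the refracted angle in degrees, or None on total internal reflection."""
    s = (ior_in / ior_out) * math.sin(math.radians(theta_in_deg))
    if abs(s) > 1.0:
        return None
    return math.degrees(math.asin(s))

# A ray hitting the eye at 30 degrees passes air -> tear film -> cornea,
# bending toward the normal at each interface before reaching the iris.
film_ior = tear_film_ior(oil_fraction=0.2)
theta_film = refract_angle(30.0, IOR_AIR, film_ior)
theta_cornea = refract_angle(theta_film, film_ior, IOR_CORNEA)
print(round(theta_film, 2), round(theta_cornea, 2))
```

Changing the oil/water mix shifts the film's index of refraction slightly, which changes how light bends before it reaches the iris; that is the kind of colorization and realism control the quote describes, independent of any particular renderer.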
