The Effects of Frozen 2
In Chris Buck and Jennifer Lee’s Frozen 2, we return to the world of Arendelle with Elsa and Anna, and we venture further into new places, such as the Enchanted Forest. These new adventures required new approaches to effects for Walt Disney Animation Studios. Here, the team behind the work break down their various challenges on the film.
Artists delivered the mist curtain to the Enchanted Forest by converting the landscape to a height field, then draping the curtain with Vellum Cloth.
Erickson notes that a handy aspect of the chosen workflow was that each sim had a corresponding bgeo sequence of advected particles that were rendered as a separate sparkle pass. “Because we planned for this instanced packed prim workflow, the VDB and bgeo sequences had the same paths except for the file extension,” he says. “This meant that once our packed volume instances were in place, with frame offsets applied, we could just change the extension of the packed disk primitive file and it would convert the volume caches to particle caches in place.”
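The extension swap Erickson describes is trivial precisely because the two cache sequences share a path. A minimal Python sketch of the idea (the cache paths and extensions here are hypothetical, not the production naming):

```python
def swap_cache_extension(path: str, new_ext: str = ".bgeo.sc") -> str:
    """Convert a volume cache path to its sibling particle cache path.
    This only works because the VDB and bgeo sequences were published
    with identical paths apart from the file extension."""
    # Strip a possibly compound extension; check ".bgeo.sc" before ".bgeo".
    for ext in (".bgeo.sc", ".bgeo", ".vdb"):
        if path.endswith(ext):
            path = path[: -len(ext)]
            break
    return path + new_ext

# Hypothetical cache path for one instanced sim, frame 1042:
print(swap_cache_extension("caches/mist/sim_03.1042.vdb"))
# caches/mist/sim_03.1042.bgeo.sc
```

With frame offsets already applied on the packed primitives, a one-line path change like this is all it takes to flip an entire instanced setup from volumes to particles.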
Artists also calculated the curvature of the curtain geometry to generate emissive volume in folds and crevices and to create a backdrop for any gaps between the instanced sims. “The turnaround on these shots was mostly on the rendering side so we had ample flexibility to iterate interactively and via flipbook,” says Erickson.
Once the team had the wall in good shape for the establishing shots, they could then focus on how it would open. First, the art department provided painted keys. Then, effects animators Bob Bennett and Dimitre Berberov set about creating the opening and closing of a multi-layered curtain. “This animation carried the volume and particle instances with it and an additional Pyro sim was run from the curtain’s edges for integration and extra sauciness,” says Erickson.
The Role of Foundation Effects
Using the TempFx tool in its Foundation Effects tool suite, Walt Disney Animation Studios was able to create lower-resolution water proxies as rigged assets in Houdini for early cross-departmental collaboration, including for generating water FX. “Having this option was key as it allowed for departments such as Layout and Animation to compose cameras and characters to a quick GL visualization of the water effects,” explains effects lead Ian Coony.
“This was especially useful for Elsa’s character interaction running atop the wave swells of the Dark Sea - giving the character animators foot placement reference on the water. As rigs were created, they could be published as assets with adjustable timeShift controls built in for per shot timing tweaks.”
Similarly, this approach helped in creating early sims for the canyon flood which, says Coony, “forced careful planning of the scale of the canyon set. The initial scale from modelling was adjusted down by 50% once the early FLIP sim tests came back and revealed there was an issue with water speed. Once the speed of the water was agreed upon, these water assets were handed off to Layout for camera staging.”
Making the Nokk
Observation of horse behavior informed the design and animation of the Nokk, with effects lead David Hutchins noting that particular attention was paid to emotional states. “Our goal was to preserve the subtle expressions, poses and gestures created in animation while layering on the motion and light response we expect from water,” Hutchins says.
Effects artists began their shots using the Nokk body motion provided by character animation and mane and tail curves from technical animation. The mane and tail curves were used as the source for POP simulations, which modeled fluid-like behavior using force vectors derived from a local distribution of neighboring particles, implemented as a POP VOP.
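The neighbor-derived forces that Hutchins’ team implemented as a POP VOP can be illustrated with a toy Python version: each particle is attracted toward distant neighbors and repelled by very close ones, a crude stand-in for fluid cohesion and pressure. The kernel and constants below are invented for illustration:

```python
import math

def fluid_like_forces(points, radius=1.0, strength=0.5):
    """For each point, derive a force from the local distribution of its
    neighbors within `radius`: attraction at range, repulsion up close."""
    forces = []
    for i, (xi, yi, zi) in enumerate(points):
        fx = fy = fz = 0.0
        for j, (xj, yj, zj) in enumerate(points):
            if i == j:
                continue
            dx, dy, dz = xj - xi, yj - yi, zj - zi
            d = math.sqrt(dx * dx + dy * dy + dz * dz)
            if d < 1e-9 or d > radius:
                continue
            # Toy kernel: negative (repel) inside half the radius,
            # positive (attract) beyond it.
            w = strength * (d / radius - 0.5)
            fx += w * dx / d
            fy += w * dy / d
            fz += w * dz / d
        forces.append((fx, fy, fz))
    return forces
```

In production this kind of lookup would run over a spatial acceleration structure per timestep; the brute-force loop here just makes the neighbor logic explicit.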
The Nokk was realized through close collaboration between the animation and FX teams.
“This base sim,” advises Hutchins, “was used for three rendered layers: meshed using Particle Fluid Surface and shaded with a water material, rasterized to a volume and rendered with a volume shader, and directly rendered as opaque points for small detached droplets. Finally, using these points as a source, a Pyro sim was run to model very lightweight ‘spindrift’ particles, also rendered as opaque sub-pixel points. The body surface geometry was remeshed to a resolution that would support hydrodynamic motion, and ripples added via layered noise and/or a Ripple Solver which could react to body motion or collide with intersecting objects. For the body interior, we added view dependent volumetric structures which included vector fields defining albedo and emissive light contribution.”
Hoof interaction with the ocean surface also became an important focal point. There was a design goal at Walt Disney Animation Studios to support the look of water interacting with water, as opposed to a hard surface interacting with water. Says Hutchins: “There were break-up, deformations and stretching added to the legs by animators, and once in our hands, the goal was to further augment this concept. Particular attention was paid to the shapes of trailing water from the legs; these shapes were established by shaping our particle sources using trailing curves smoothed via resampling and converting to NURBS.”
For shots in which the Nokk transitions from water to ice, two different techniques were used. When underwater, the transition started with ice crystals growing inside the body surface using animated geometry instanced to scattered points, followed by a transition from water to ice material via a Solver SOP generated mask.
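A Solver SOP mask like the one described works by accumulating state frame over frame. A minimal Python sketch of a single growth step, assuming a simple distance threshold (the real setup seeds from the growing ice crystals and drives a material blend):

```python
def grow_ice_mask(positions, mask, radius=0.3):
    """One solver step: a point freezes if any already-frozen point lies
    within `radius`. Iterating this per frame grows the mask outward,
    mimicking a Solver SOP feeding its output back in as input."""
    new_mask = list(mask)
    r2 = radius * radius
    for i, (xi, yi, zi) in enumerate(positions):
        if mask[i]:
            continue
        for j, (xj, yj, zj) in enumerate(positions):
            if not mask[j]:
                continue
            dx, dy, dz = xj - xi, yj - yi, zj - zi
            if dx * dx + dy * dy + dz * dz <= r2:
                new_mask[i] = True
                break
    return new_mask
```

Because each frame only reads the previous frame’s mask, the transition front advances one neighborhood per frame, which is what gives the ice its creeping, organic spread.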
Another magical element to Frozen 2 involved fire, both for shots involving the fire spirit and the forest seen on fire. To create the forest on fire, artists first optimized their pipeline with the help of the technical director (TD) team in order to have an asset library of trees that would already contain the fire.
The layout artists and effects artists, normally separated by several steps in the pipeline, then collaborated heavily early on in production when it came to fire in the Enchanted Forest. “Instead of waiting for the effects artists to implement fire in a shot, the layout team was able to visualize where the fire would be and which trees would be on fire during their earlier stage of production,” outlines effects lead Marie Tollec. “This enabled the layout artists to place the camera accordingly, in order to have interesting compositional framing, as well as enable them to identify themselves which trees would be on fire, rather than have the effects department do it later in the pipeline.”
The tool that enabled this - dubbed Firestarter - was developed by the TD team in Python. Through viewport selection, a layout artist could highlight the trees to be set on fire. Those trees would appear pink in the Maya viewport and layout would be approved this way. On the effects side, the simulation was made in Houdini using Pyro on each tree at the origin. Then the fire VDBs were published, alongside their Hyperion shader material as well as the particles used for the embers. (Hyperion is Walt Disney Animation Studios’ proprietary renderer).
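At its core, a tool like Firestarter resolves a viewport selection to the fire-carrying variant of each asset. A toy Python sketch of that resolution step (the asset names, registry, and `_onFire` suffix are invented for illustration; the real tool also handles Maya display and publishing):

```python
def mark_on_fire(selected, asset_registry):
    """Resolve a selection of tree instance names to the variant each
    instance should load. `asset_registry` maps instance name to base
    asset name; selected instances get the fire-carrying variant."""
    resolved = {}
    for name in selected:
        base = asset_registry.get(name)
        if base is None:
            raise KeyError("unknown asset: " + name)
        resolved[name] = base + "_onFire"
    return resolved

# Hypothetical scene: two tree instances, one selected by layout.
registry = {"tree_017": "oakTree", "tree_042": "pineTree"}
print(mark_on_fire(["tree_017"], registry))
# {'tree_017': 'oakTree_onFire'}
```

The payoff of publishing fire at the asset level is visible here: the selection produces nothing but variant names, and every downstream department resolves those names to the same cached VDBs, shader, and ember particles.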
“These fires were published at the asset level of the pipeline. This way, every time a tree was dropped from the asset list into a 3D scene, the artist manipulating the tree, from whatever department they were in, had the choice to select the onFire version of the tree that would contain all the fire information.”
Marie Tollec, FX Lead
That approach enabled all the set-dressed fire, which was then supplemented with custom pyro simulations added by effects artists. The sim interacted with elements and characters in the scene and would be ‘put out’ by Elsa’s magic. Tollec adds that the team worked with the film’s production designer to come up with a more magical look to the flames and ended up re-mapping the heat field to push the colors towards the purple/magenta range.
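The heat-field remap Tollec mentions can be illustrated as a simple ramp lookup from normalized heat to color. The RGB endpoints below are invented for illustration, not the production values:

```python
def remap_heat_to_magenta(heat):
    """Map a normalized heat value in [0, 1] to an RGB color, pushing
    the usual orange fire ramp toward the purple/magenta range.
    Endpoint colors are illustrative placeholders."""
    h = max(0.0, min(1.0, heat))  # clamp out-of-range heat samples
    deep_purple = (0.25, 0.0, 0.45)   # cool, low-heat end of the ramp
    hot_magenta = (1.0, 0.2, 0.9)     # bright, high-heat end
    return tuple(a + (b - a) * h for a, b in zip(deep_purple, hot_magenta))
```

In practice this kind of remap is applied per voxel to the simulation’s heat field at shading time, so the sim itself stays untouched while the look is art-directed.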
Finally, in relation to the fire spirit itself - which took the form of a salamander - artists relied on a pyro simulation with embers only present at times of extreme emotion. Tollec also comments that “the material for the salamander had two variants with a spicy (on fire) and a non-spicy version. The simulation itself was made on the moving salamander, with traditional techniques to pre-slow down/minimize the motion when it was too extreme.”
Gale, The Wind Spirit
Most of the spirits in Frozen 2 are effectively ‘visible’ in some form or another. The wind spirit, Gale, however, was necessarily ‘invisible’, and only showed up as the influencer of the environment around it. This led to a need to animate environment assets to a level not previously attempted at Walt Disney Animation Studios.
To start the process, character animators were provided with a rig in Maya that allowed them to visualize the position, directionality, width, and length of Gale’s influence on the environment. Effects lead Ben Fiske explains this further: “While most animated this rig as they would any other character, VR enabled rigs were also available, allowing some to dive directly into the scene and animate via body motion alone. From there, curves were written out as bgeos containing common point attributes that would help drive the interaction.”
The impact of Gale necessitated a deep collaboration between FX and the environment teams. “The environments,” says Fiske, “were authored using a variety of proprietary procedurals that arrived in a variety of different formats but, critically, all had hooks leading back to geometry data stored on disk as bgeos. By having bgeos alongside Disney’s CAF format, effects artists were able to use custom OTLs to quickly load an entire environment’s worth of instanced procedural environment assets using delayed load packed primitives, animate them, and write those back out into the pipeline.”
“By using packed primitives, the data sets being manipulated were extremely lightweight and, in most cases, assets like fallen leaves and simple plant life could be animated on a single point per plant basis, never needing unpacking. This allowed artists to leverage DOP solvers like POPs, and in some cases, the new Vellum solver, resulting in quick iterations and extremely simple approaches.”
Ben Fiske, FX Lead
Gale’s presence in the film sometimes takes the form of a tornado or a billowing flurry of snowflakes. The tornado appearance relied on Houdini’s Pyro solver in conjunction with instanced volumes and debris that were manipulated in SOP context. “For shots shown inside of the tornado funnel,” states Fiske, “this workflow of using SOPs for the instanced volumes and debris allowed for specific art direction with regards to speed, amount, and hitting story points important to the shots.”
Then, for the snowflakes-related shots of Gale, artists advected points through a billow Pyro sim at the origin that were point deformed into place using the data provided by the animation department. “These points were later instanced with snowflakes and smaller particulate, and an additional volumetric pass was layered in using the base Pyro sim,” says Fiske. “Simple solutions for complex looking problems.”
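Advecting points through a sim is, at bottom, numerical integration through a velocity field. A minimal forward-Euler sketch, where the velocity function stands in for sampling the Pyro sim’s `vel` field:

```python
def advect(points, velocity_fn, dt, steps):
    """Forward-Euler advection: move each point along the velocity
    sampled at its current position, `steps` times with timestep `dt`.
    A minimal stand-in for advecting snowflake points through a sim."""
    pts = [tuple(p) for p in points]
    for _ in range(steps):
        pts = [
            (x + velocity_fn(x, y, z)[0] * dt,
             y + velocity_fn(x, y, z)[1] * dt,
             z + velocity_fn(x, y, z)[2] * dt)
            for (x, y, z) in pts
        ]
    return pts
```

The resulting point cloud is then what gets point-deformed into the shot and instanced with snowflake geometry; the advection itself stays at the origin, which is what makes the sim reusable across shots.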
The Earth Giants
Earth spirits, in the form of giant rock formations, were characters that were realized in partnership between animation and FX. Fiske notes that the studio was able to leverage work that had already gone into animating trees and vegetation with Gale. This included utilizing constrained curves run through Houdini’s Vellum solver; here, artists expanded on the idea by building out automated rigs in SOPs that tracked the footsteps of each of these giants.
“Using these footsteps,” says Fiske, “a growing ripple was instanced that all of the vegetation rigs were looking for to apply a downward shake force, allowing for not only a falloff on intensity, but an offset in timing.”
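The expanding footstep ripple Fiske describes amounts to a distance- and time-based falloff: plants farther from the footfall shake later and less. A toy version, with illustrative constants:

```python
import math

def shake_intensity(plant_pos, foot_pos, impact_time, t,
                    speed=5.0, falloff=2.0):
    """Downward-shake intensity on one plant from one expanding
    footstep ripple. Zero until the ring reaches the plant, then
    decaying exponentially with distance from the footfall."""
    dx = plant_pos[0] - foot_pos[0]
    dz = plant_pos[1] - foot_pos[1]
    dist = math.hypot(dx, dz)
    ring = speed * (t - impact_time)  # current ripple radius
    if ring < dist:
        return 0.0  # ripple has not arrived yet: the timing offset
    return math.exp(-falloff * dist)  # intensity falloff with distance
```

Because each vegetation rig just evaluates a function of position and time, every plant in the forest can look up its own shake independently, which is what makes the setup scale to a full environment.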
The Earth Giants, and their interaction with the forest environment, benefited from automated rigs in SOPs that tracked the giants’ footsteps.
In addition to the tree shake, the giants interacted heavily with the surrounding forest. Forest interactions were aided by the Vellum curve-based workflow established with Gale. This allowed for direct collisions between the Earth Giants and branches, while constrained sectioned RBDs were also set up for all of the modeled tree trunks, allowing for a complete, deformable tree simulation.
“Beyond this,” advises Fiske, “logic was added to release leaves from the trees with enough force applied to the branches, which were then simulated in POPs. With all of these pieces in play, the true chaos of angry Earth Giants walking through a fully dynamic forest was possible, down to the last leaf.”
For the dam sequence, Walt Disney Animation Studios had to not only simulate the rushing water, but also its destruction. As Marie Tollec outlines, “the destruction of the dam was to be at the climax of the movie and we needed to create a complex workflow that would include almost every department to set it up correctly, from visual development, modeling, layout, effects, animation, and lighting.”
Since the sequence would be very much effects-driven, the workflow for destroying the dam was approached differently than most in the film. The effects and the way the dam would shatter would affect not only the layout and camera placement but also come into play before the animation. This was so that Anna, who runs on the dam, would be able to react to the environment being destroyed around her, in addition to making foot contact with the structure.
The dam break needed to simulate an area 2 km long and 300 m wide.
“The workflow,” attests Tollec, “allowed for a great performance in the character emoting as well as an optimized model construction and destruction. The dam appears in a number of shots and needed to be optimized for both closeup and far away shots. The model was also designed so that every piece would be unique in the combination of size, texture, displacement, moss and construction marks.”
Once the rigid body simulation was director-approved, it would go to layout for camera readjustment, then character animation and technical animation, before circling back to effects where, says Tollec, “we would add secondary dust, debris and particulates, alongside running the water simulation. In most cases, the water simulation was run after the destruction, with the water reacting to the RBD rather than the other way around. The continuity of the action also allowed us to optimize the simulations by running a longer simulation which would span over several shots. This was also helped by the fact that the events and timing of the sequence were choreographed through rough previz ahead of time by our Heads of Effects.”
Two musical sequences in Frozen 2, ‘Into the Unknown’ and ‘Show Yourself’, called for an alternative take on Elsa’s magic effects. In them, a mysterious force dazzles her with visions of the elements of nature and premonitions of events to come.
“These effects necessitated a high degree of specificity which we achieved using a combination of procedural and keyframe animation techniques,” details effects lead Alex Moaveni. “We would dial in the animation on an individual component, such as a deer or tree, then scale up using Python scripts and our proprietary scene archive distribution system, Toolshed.”
“We added a hand-animated quality to these effects using a rotomation toolset we wrote in Python, wrapped around Houdini’s stroke SOP. Animating curves or surfaces natively gave us consistent UVs to stick instance points to, important for controlling motion.”
Alex Moaveni, FX Lead
For some shots, adds Moaveni, artists keyframed individual points, while for others they used a procedural VEX method in a ‘quasi-POPs-like’ way to sample positions on curves or surfaces using a normalized age attribute. “To our benefit we could scrub in real-time and use ramps or curve functions to affect the apparent energy of the points.”
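The ‘quasi-POPs-like’ sampling Moaveni describes, driving position from a normalized age attribute, can be sketched as arc-length sampling of a polyline. This pure-Python version is a simplified stand-in for the VEX implementation:

```python
import math

def sample_curve(points, u):
    """Sample a polyline at normalized parameter u in [0, 1] by arc
    length. Feeding a normalized age attribute in as u lets motion be
    scrubbed in real time, with no simulation to re-run."""
    segs = []
    total = 0.0
    for a, b in zip(points, points[1:]):
        d = math.dist(a, b)
        segs.append((a, b, d))
        total += d
    target = max(0.0, min(1.0, u)) * total
    run = 0.0
    for a, b, d in segs:
        if run + d >= target and d > 0:
            f = (target - run) / d  # fraction along this segment
            return tuple(pa + (pb - pa) * f for pa, pb in zip(a, b))
        run += d
    return points[-1]
```

Remapping u through a ramp or curve function before sampling is what changes the apparent energy of the points, easing them in and out along the stroke without touching the curve itself.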
Amongst all of these other effects challenges, Elsa’s magic still plays a key role in the storytelling. Effects lead Thom Wickes identifies this as being all the way from the “lyrical, playful magic in earlier sequences to the more powerful magic she uses when extinguishing magical fire or fighting for her life in the dark sea.”
The broad variety of magic effects meant the studio adopted a more flexible, modular toolset for this work. “Most magical effects were largely SOPs-based,” says Wickes, “with simple geometry used for blocking in the effects. We used curves for the more whimsical effects, and larger cone shapes for the aggressive blasts. This allowed effects artists to focus on placement, shape, and timing, before generating the final data.”
Ice visions exhibiting Elsa’s magic took on a hand-animated quality.
“The more playful magic relied more on particle snowflakes advected along curves in POPs with supplementary mist and glow volumes,” continues Wickes. “For the larger blasts, we used procedural noise textured volumes moving through a deformed coordinate space; this data was used to generate additional mist, particles, and frost material overrides.”
Effects and Storytelling
As it had done on several recent productions such as Ralph Breaks the Internet, Moana, Zootopia and Big Hero 6, Walt Disney Animation Studios sought to leverage continual developments in FX for Frozen 2. And, as the observations from the team above show, this was also done in collaboration with the rest of the studio, making that FX work fit right into the final frames.