The water droplets on the kids fighting on the beach in ‘Avatar: The Way of Water’ took 8 days to simulate

Covering the CG characters with photoreal drips via a thin film solver was just one of the many water simulation challenges on the film.

At one point in James Cameron’s Avatar: The Way of Water, the brothers Neteyam and Lo’ak scuffle with some Metkayinan teenagers amongst the shallow lapping waves of a beach. It’s a stunning scene, not least because of the presence of fine water droplets on each CG character. The simulation for those would take days to compute.

With so many scenes set both above and below the water, the team behind the work–which included artists at the VFX studio Wētā FX and the engineers who are now part of Wētā Digital/Unity following the acquisition–needed a comprehensive water toolset for the job.

Indeed, work on such a toolset, built specifically for The Way of Water, began as far back as 2017, with Loki, an in-house proprietary simulation framework, at its center. Here, Alexey Stomakhin, principal research engineer at Wētā Digital x Unity, tells befores & afters about the extensive R&D process for water simulation on the film.

(Stomakhin, Steve Lesser, Sven Joel Wretborn and Douglas McHale just took out the Emerging Technology Award for The Way of Water’s water toolset at this year’s VES Awards, with several other VES Award trophies also going to those who worked on the film in a number of categories, including some relating directly to water.)


This shot of Payakan and Lo’ak playing together in the water involved multiple solvers running in tandem. Image copyright © Wētā FX.

The origins of the water R&D

Alexey Stomakhin: We had a special Water Development Project established back in 2017. It focused specifically on the upcoming water challenges. It was treated as a miniature show consisting of a number of challenging shots. The goal was that we would not simply strive to final those sample shots, but rather use them over time to continuously refine the pipeline, so that when we got 2,000+ water shots to work on for the movie–The Way of Water actually had 2,225 shots with water elements in them–and a lot of FX people came on board, the toolset would allow us to deliver those shots with consistent, industry-leading fidelity.

We got a hint of what was to come in Alita: Battle Angel

Alita: Battle Angel came out at the end of 2018. When I started at Wētā, I got to work on this one shot in the movie where the main character dives into the water to get inside of a spaceship and then emerges back from the water right next to the camera. That shot was where a lot of the water tools were battle-tested.

The supervisors, Joe Letteri and James Cameron, were very particular about how they wanted the water to roll off the face, how the cloth and hair needed to be coupled with the water, and how it all needed to look natural and organic together. It almost felt like we were proving the worth of our techniques for a future production.

The Way of Water’s over-arching water architecture

The toolset consists of a number of distinct solvers. Initially, there were attempts to envision a single-representation water system that would simulate everything at once, but that turned out not to be computationally feasible, with scales ranging from large 10- or 100-meter splashes and wakes all the way down to sub-millimeter drips on skin.

So even though the same fluid equations get solved under the hood, we had to make certain assumptions and simplifications for each of the different scenarios to ensure we keep compute times within reasonable bounds.

This video is from the SIGGRAPH 2022 Technical Paper: ‘Loki: a unified multiphysics simulation framework for production’.

On stage: procedural water

Water started with a procedural GPU-based deformer that would be used on stage. It provides an approximation of how waves behave far out in the ocean. The solution is based on the assumption that you can combine multiple Gerstner/Stokes waves to create arbitrary wave patterns. It is somewhat simplistic, but at the same time there is a certain physicality built into it, with Tessendorf and TMA being popular choices for the frequency spectrum.

This approach gave the director the right idea of how the water would look, not only in terms of motion but also lighting. Then, for us, it provided a physically plausible starting point to drive more complex simulations.
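The wave superposition described above can be sketched in a few lines. This is a generic Gerstner-wave sum, not Wētā’s actual GPU deformer; the function name and wave parameters are invented for illustration, and in practice a Tessendorf or TMA spectrum would supply the amplitudes:

```python
import math

def gerstner_offset(x, z, t, waves, g=9.81):
    """Displacement of a grid point (x, z) at time t from a sum of Gerstner
    (trochoidal) waves, using the deep-water dispersion relation w^2 = g*k.
    Each wave is (dir_x, dir_z, wavelength, amplitude, steepness)."""
    dx = dy = dz = 0.0
    for (dirx, dirz, wavelength, amplitude, steepness) in waves:
        k = 2.0 * math.pi / wavelength
        omega = math.sqrt(g * k)            # deep-water dispersion relation
        phase = k * (dirx * x + dirz * z) - omega * t
        dx -= steepness * amplitude * dirx * math.sin(phase)  # horizontal sway
        dz -= steepness * amplitude * dirz * math.sin(phase)
        dy += amplitude * math.cos(phase)                     # vertical motion
    return dx, dy, dz

# Two illustrative waves: a 12 m swell and a 5 m cross-chop.
waves = [(1.0, 0.0, 12.0, 0.4, 0.7),
         (0.6, 0.8, 5.0, 0.15, 0.5)]
print(gerstner_offset(3.0, 1.0, 0.5, waves))
```

Because each term is a closed-form function of position and time, the sum evaluates independently per vertex, which is what makes this kind of deformer a good fit for the GPU.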

Loki as the centerpiece and state machine

Most of the water tools we used are based inside of our proprietary simulation framework, Loki. The most prominent one is the Loki state machine, our airborne spray system. It consists of a number of distinct solvers that excel at different scales but run in tandem. We have solvers for multiple water states: bulk, spray and mist, all of which are coupled with the surrounding air. Transitions between the states are handled in a mass- and momentum-conserving way.

Motor boat. Example frame of a boat wake simulation that uses the Loki state machine approach (top) is shown with a breakdown of the water states (bottom). Bulk water is shown in blue, spray in green, and mist in red. Image copyright © Wētā FX.

In practice, we start with basic FLIP water–we call it bulk water–which represents “solid” volumetric fluid. As it gets in contact with air, the interplay between Reynolds stress and surface tension may cause it to shatter into droplets, which we call spray. The spray is simulated using SPH (smoothed-particle hydrodynamics), since we still want the droplets to interact to capture “tendrily” surface tension effects. The spray further transitions into even smaller mist particles, which are essentially ballistic particles fully coupled with the surrounding air via a drag force. That coupling is quite crucial, because at that scale air has a great effect on the dynamics of water.
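A toy sketch of that bulk-to-spray-to-mist progression might look like the following. The neighbour-count criterion and all thresholds here are invented for illustration; the production state machine couples full FLIP, SPH and air solvers rather than anything this simple:

```python
from dataclasses import dataclass

@dataclass
class Particle:
    pos: tuple
    vel: tuple
    mass: float
    state: str  # "bulk", "spray" or "mist"

def count_neighbours(p, particles, radius=0.1):
    return sum(1 for q in particles if q is not p and
               sum((a - b) ** 2 for a, b in zip(p.pos, q.pos)) < radius ** 2)

def update_states(particles):
    """Reclassify particles by local density. Mass and velocity are carried
    across unchanged, so the transition itself conserves mass and momentum."""
    for p in particles:
        n = count_neighbours(p, particles)
        if n >= 8:
            p.state = "bulk"   # dense region: "solid" volumetric fluid (FLIP)
        elif n >= 2:
            p.state = "spray"  # loose droplets, still interacting (SPH)
        else:
            p.state = "mist"   # isolated: ballistic, air-drag coupled

def mist_drag(p, air_vel, drag=4.0, dt=1.0 / 48):
    """Drag force relaxing a mist particle toward the local air velocity."""
    p.vel = tuple(v + drag * (a - v) * dt for v, a in zip(p.vel, air_vel))
```

Because a particle keeps its mass and velocity when it changes state, the reclassification mirrors the mass- and momentum-conserving transitions described above, while each state can then be advanced by the solver best suited to it.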

One of the best examples of using the Loki state machine is the shot of Payakan and Lo’ak playing together in the water. In that shot there are multiple solvers running in tandem. There is bulk water. There is SPH spray. There are ballistic mist particles. All coupled with the surrounding air and completed in a single simulation pass. So even though there are multiple simulation layers, they are all computed simultaneously with proper transitions and interactions between them. And that’s partly why the shot looks so natural and realistic. I do believe that the foam and the underwater bubbles for that shot were done in a separate pass, but all of the airborne components were completed within a single Loki solve.

Bubbles, coupling and more

We have discussed water becoming airborne, but similarly, air can get submerged in the water, creating bubbles. Those get simulated with our Loki secondaries system. The secondaries technique originated from the observation that even though you might think the dynamics of bubbles are somewhat decoupled from the water, they really aren’t. There are certain interaction effects you may be after.

Rocky shore. A secondary simulation of foam (white) and bubbles (blue) on a pre-cached fluid simulation (not shown). Image copyright © Wētā FX.

In order to capture realistic bubble-water interaction we would actually re-simulate portions of bulk water to ensure accurate coupling with the bubbles. This allowed us to capture so-called collective effects, such as a group of bubbles rising faster than each individual bubble inside of that group, and it turned out to make a significant difference visually.
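The collective effect can be demonstrated with a toy two-way coupling, in which identical bubbles and a single well-mixed water parcel exchange drag momentum. This is emphatically not Loki’s secondaries solver–every constant and the one-parcel simplification are invented for illustration:

```python
def rise_height(n_bubbles, steps=200, dt=0.01):
    """Height reached by a group of identical bubbles that are two-way
    coupled (via drag) to one shared water parcel. Arbitrary units."""
    buoyancy = 1.0        # upward acceleration per bubble
    drag = 5.0            # bubble-water drag coefficient
    water_mass = 50.0     # mass of the coupled water parcel
    bubble_v = 0.0        # shared vertical velocity of the bubbles
    water_v = 0.0         # vertical velocity of the water parcel
    height = 0.0
    for _ in range(steps):
        rel = water_v - bubble_v
        bubble_v += (buoyancy + drag * rel) * dt
        # Equal-and-opposite drag momentum goes back into the water (two-way).
        water_v -= n_bubbles * drag * rel * dt / water_mass
        height += bubble_v * dt
    return height

print(rise_height(1), rise_height(50))  # the crowd of bubbles rises farther
```

With more bubbles, the shared water is dragged upward faster, which in turn reduces the drag holding each bubble back–so the group outruns the lone bubble, the collective effect described above. A one-way simulation (bubbles reading a frozen water cache) would miss it entirely.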

As the bubbles rise to the surface, they transition into SPH foam.

For any assets submerged in the water–cloth, hair, plants–even though you may not see the water surface, we would run a fully coupled fluid solve to make sure we got a proper response from the surrounding environment. For instance, the Creature department would heavily rely on Loki for these kinds of interactions.

There is not necessarily much new to our bulk water FLIP solver, because FLIP solvers have been around for a while. However, we incorporated narrow-band simulation, spatial adaptivity and distribution over MPI to be able to handle massive high-resolution simulations.
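The narrow-band idea can be sketched as a simple culling pass over a signed distance field: particles are kept only near the liquid surface, while the deep interior is represented on the grid alone. The band width and the flat stand-in surface here are illustrative only:

```python
def narrow_band_cull(particles, phi, band=0.05):
    """Keep a particle if it lies within `band` of the surface.
    phi is the signed distance to the surface (negative inside the liquid);
    deep interior particles are dropped and handled purely on the grid."""
    kept, dropped = [], []
    for p in particles:
        (kept if phi(p) > -band else dropped).append(p)
    return kept, dropped

# A flat water surface at y = 0 stands in for a real signed distance field.
phi = lambda p: p[1]
particles = [(0.0, y, 0.0) for y in (-0.2, -0.07, -0.03, -0.01, 0.0)]
kept, dropped = narrow_band_cull(particles, phi)
print(len(kept), len(dropped))  # 3 particles kept near the surface, 2 dropped
```

Since FLIP particle counts scale with liquid volume but the visually important detail lives at the surface, confining particles to a band cuts memory and compute dramatically on large ocean-scale sims.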

Loki integration

Loki has been integrated into both SideFX Houdini and Autodesk Maya through special bridge nodes that allow users to bring data in and out of Loki. It can also run as a standalone product, which is especially useful for multi-machine distributed applications.

Data flow between Loki and the host application is via a live bridge (represented as a plugin in the respective host), which handles the conversion between host and Loki data types. Image copyright © Wētā FX.

Loki’s RenderGraph

Loki’s RenderGraph allows you to bring all the different water components–spray, mist, bubbles, volumes–together and composite them without compromising the quality or fidelity of individual elements. The result is passed to Manuka, our in-house renderer, to produce the final image.

Residual wetness

When you see characters standing in the rain with tiny droplets on their faces, slowly evolving and forming rivulets, that is our residual wetness system at work. Residual wetness was actually developed by the Look Dev team outside of Loki. When the characters are not too wet and the droplets are mostly sitting in place, there isn’t much reason to simulate it all: simulation is expensive, and really, you may want to be very particular about how the droplets evolve. So it was more of an art-directed than a physical solution. Yet this extra layer added an unbelievable level of realism.

The major challenge of dripping characters

For characters getting out of the water, we needed to create realistic-looking drips shedding off their skin and clothing. We were specifically after capturing fine thin sheets and tendrils of liquid driven by surface tension forces. The solution we came up with was simulation-based and mimicked what had been done in the previously mentioned Alita: Battle Angel shot.

Thin film: Example frame of a thin film simulation on a skin surface. Image copyright © Wētā FX.

Under the hood, it is essentially a FLIP simulation, but a very expensive one, because those sims needed to run at a resolution of a fraction of a millimeter to achieve the desired level of detail. A single FLIP particle would represent, say, a tenth or a fifth of a millimeter of fluid, so the sheer number of particles was a computational challenge in itself. The presence of surface tension forces also introduced a severe timestep restriction: we would need to run our simulations with 10-20 substeps per frame (at 48fps), which led to extremely long simulation times, in some cases up to multiple days.
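As a back-of-envelope check on that restriction, the classic explicit surface-tension stability bound, dt ≤ sqrt(ρΔx³/(2πσ)) in the Brackbill et al. form, can be evaluated at the quoted resolutions. Note that a fully explicit bound is even harsher than the 10-20 substeps mentioned above, so this is an upper-bound illustration of why sub-millimeter surface tension is expensive, not a description of the production solver’s actual scheme:

```python
import math

rho = 1000.0   # water density, kg/m^3
sigma = 0.072  # water surface tension, N/m
fps = 48.0     # the film's high frame rate

for dx in (1e-4, 2e-4):  # a tenth / a fifth of a millimeter per particle
    # Explicit surface-tension stability limit on the timestep.
    dt_max = math.sqrt(rho * dx ** 3 / (2.0 * math.pi * sigma))
    substeps = math.ceil((1.0 / fps) / dt_max)
    print(f"dx = {dx * 1000:.1f} mm -> dt_max = {dt_max:.2e} s, "
          f"{substeps} explicit substeps per frame")
```

The bound shrinks with Δx^1.5, so halving the particle size nearly triples the substep count on top of the eightfold particle increase–which is why these sims ran for days.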

Another unforeseen challenge that had to be addressed was making droplets stick to character surfaces. We had to ensure the contact angle computation and viscosity treatment were done accurately, as otherwise the water would easily fall off, which is of course undesirable. We published a short paper about this in 2019, but I suppose people may not have connected it to Avatar: The Way of Water, because it appeared a few years before the film. In addition, the characters, acting as kinematic colliders, and their velocities had to be prepared in a very specific way so that the water wouldn’t become disconnected and would follow the surfaces precisely. We ended up building a whole set of Houdini HDAs around our FLIP solver to guarantee that the inputs were generated accordingly.

Lo’ak (Britain Dalton) in 20th Century Studios’ AVATAR: THE WAY OF WATER. Photo courtesy of 20th Century Studios. © 2022 20th Century Studios. All Rights Reserved.

For the Alita shot, the thin film water only covered her face, and even then we had to simulate close to 10 million FLIP particles. Now, imagine covering the whole body of a character, or several characters, with water. Those particle counts easily reach hundreds of millions, and the compute times go through the roof. So we had to be extremely careful about how and where we emitted and culled those particles to keep the complexity of our simulations under control.

As a character left the water, we detected where the character’s mesh crossed the water surface and would “glue” new particles to the emerging parts of their body. And as soon as particles got back to the main body of water, we would remove them from the sim. This approach was quite efficient at the end of the day, and we were able to simulate thin film water on multiple characters within a single shot.
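That gluing and culling bookkeeping might be sketched like this, with a flat surface at y = 0 standing in for the real water surface. The function and data layout are invented for illustration:

```python
def update_thin_film(glued, prev_heights, vertices, water_level=0.0):
    """Track which mesh vertices carry thin-film particles.
    glued: set of vertex ids currently carrying a glued particle.
    prev_heights: vertex id -> height on the previous frame.
    vertices: vertex id -> (x, y, z) position this frame."""
    for vid, (x, y, z) in vertices.items():
        was_below = prev_heights.get(vid, y) <= water_level
        if y > water_level and was_below:
            glued.add(vid)        # vertex just emerged: glue a new particle
        elif y <= water_level:
            glued.discard(vid)    # back in the main body of water: cull it
        prev_heights[vid] = y
    return glued

glued, prev = set(), {}
update_thin_film(glued, prev, {0: (0, -0.1, 0), 1: (0, -0.3, 0)})
update_thin_film(glued, prev, {0: (0, 0.2, 0), 1: (0, -0.1, 0)})
print(sorted(glued))  # only vertex 0 has emerged and carries a particle
```

Emitting only at the waterline crossing and culling on re-entry keeps the live particle set proportional to the exposed wet area rather than to the whole simulation, which is what made multi-character thin-film shots tractable.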

Even so, many of our thin film water sims were painfully time consuming. One of the most expensive sims in the entire movie was in the shot where the Na’vi kids fight on a beach in the Metkayina village. The water interaction was simulated for every character, and a single iteration took 8 days to complete.

Despite the long simulation times, our thin film solver was used in nearly every shot with wet Na’vi characters, providing slow-but-steady solutions in some of the most difficult and dynamic scenarios and leading to outstanding visual results.

A Tulkun in 20th Century Studios’ AVATAR: THE WAY OF WATER. Photo courtesy of 20th Century Studios. © 2022 20th Century Studios. All Rights Reserved.


Loki: a unified multiphysics simulation framework for production

Underwater bubbles and coupling

A practical guide to thin film and drips simulation

Guided bubbles and wet foam for realistic whitewater simulation

Wave curves: simulating lagrangian water waves on dynamically deforming surfaces
