Behind two key Wētā FX tech developments for fire and character deformation on Avatar: Fire and Ash.
When James Cameron’s Avatar: Fire and Ash was recognized with the Academy Award for Best Visual Effects this past week, it represented the culmination of years of research, development, and artistry from Wētā FX.
On this latest movie, and in the two previous Avatar films—which also both received Best Visual Effects Oscars—the visual effects studio continued to advance the art in the areas of performance capture, computer generated characters, and water and fire simulations.
Ultimately, Wētā FX was responsible for 3,132 visual effects shots in Avatar: Fire and Ash, and in the process generated some astronomical numbers in realizing the final imagery: the studio used 1,248,087,308 processor-hours to render the final shots, which would have taken roughly 142,000 years on a single processor. Meanwhile, Wētā FX’s work on the film consumed a total of 140 petabytes (140,000 terabytes) of disk space, generated at a rate of roughly 200-250 terabytes of data per day.
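As a quick sanity check, the headline figures above are internally consistent. A small Python sketch (only the quoted numbers come from the production; the hours-per-year value is a standard approximation):

```python
# Sanity check on the published render statistics. Figures are from the
# article; the hours-per-year value is a standard approximation.
render_hours = 1_248_087_308              # total processor-hours reported
hours_per_year = 365.25 * 24              # ~8,766 hours per year

single_processor_years = render_hours / hours_per_year
print(round(single_processor_years, -3))  # ~142,000 years

# 140 PB written at ~225 TB/day implies well over a year and a half of output.
days_of_output = 140_000 / 225            # terabytes / (terabytes per day)
print(round(days_of_output))              # ~620 days
```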
Supporting that rendering workload were AMD EPYC processors and the Lenovo ThinkStation P8, powered by AMD Ryzen Threadripper PRO. “We began our AMD journey a number of years ago now,” outlines Wētā FX Chief Technology Officer Kimball Thurston. “For Fire and Ash, it was also more about, how do we better leverage the many, many cores that the AMD processors have to better do more efficient multi-threading and make renders faster and get turnaround to artists faster? From a science and technology standpoint, that was a big push for this film.”
A breakthrough on fire
One of the key technological innovations from Wētā FX on Fire and Ash was Kora. This is the studio’s new toolset for physics-based chemical combustion simulations. Yes, fire! And there’s plenty of it in the film, from flaming arrows and flamethrowers to huge explosions and even fire tornadoes.
Kora was developed to be deliberately artist-friendly, as Wētā FX Lead FX Supervisor Nicholas Illingworth explains. “Fire was one of the things that we kept tripping up against on The Way of Water. We spent years developing this chemical combustion toolset, but what we found when it was in production was that it was incredibly hard to use. We really knew that there was power in the solver, but we just needed to make it more accessible.”
So, Wētā FX set out to do exactly that. While still maintaining high-fidelity in the fire simulations, the development of Kora—which was also recently recognized with a Visual Effects Society Emerging Technology Award—allowed artists to input certain fuel types and control whether those types should be premixed or whether there might be an incomplete combustion, soot, and what the balance of fuel and oxygen should be.
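To make those controls concrete, here is a hypothetical sketch of the kind of artist-facing parameters the article describes. Kora’s actual API is not public; every name and default below is an illustrative assumption:

```python
from dataclasses import dataclass

# Hypothetical sketch of artist-facing combustion controls of the kind the
# article describes for Kora. All names and defaults are assumptions; this
# is not Wētā FX's actual interface.
@dataclass
class CombustionSettings:
    fuel_type: str = "methane"      # artist-selectable fuel type
    premixed: bool = False          # premixed burn vs. diffusion-style flame
    fuel_oxygen_ratio: float = 1.0  # 1.0 = stoichiometric fuel/oxygen balance
    allow_incomplete: bool = True   # permit incomplete combustion

    def soot_expected(self) -> bool:
        # Fuel-rich, incomplete burns leave unburnt carbon behind as soot.
        return self.allow_incomplete and self.fuel_oxygen_ratio > 1.0

settings = CombustionSettings(fuel_oxygen_ratio=1.4)
print(settings.soot_expected())  # fuel-rich burn: soot expected (True)
```

The point of an interface like this is exactly the accessibility Illingworth describes: a mid-level TD reasons about fuel, balance, and soot rather than raw solver internals.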
Wētā FX’s workflow for simulating fire first involved running low-res sims. Illingworth notes these could take a couple of hours, with high-res sims being run overnight. “There were times where we had to lean on MPI (Message Passing Interface), so we would be using multiple machines to spread out the sim. That would largely be for large scale events like the Factory Ship exploding or the 300m fire tornado. Prior to that, a lot of our simulations are on a single high memory machine.”
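The multi-machine approach Illingworth mentions is classic domain decomposition: each machine owns a slab of the simulation grid and exchanges a thin “halo” of boundary cells with its neighbors every step. A toy sketch in plain Python (no real MPI; the 1-D diffusion solve and all names here are illustrative) shows the idea:

```python
# Illustrative sketch (plain Python, no real MPI) of the domain-decomposition
# idea behind spreading one large simulation across machines: each "rank"
# owns a slab of the grid plus one ghost cell on each side.
def diffuse_step(field, alpha=0.25):
    # One explicit diffusion step over the interior of a 1-D field.
    return [field[i] + alpha * (field[i-1] - 2*field[i] + field[i+1])
            for i in range(1, len(field) - 1)]

def step_decomposed(field, num_ranks):
    n = len(field)
    slab = n // num_ranks
    out = []
    for rank in range(num_ranks):
        lo = rank * slab
        hi = (rank + 1) * slab if rank < num_ranks - 1 else n
        # Each rank's working array includes ghost cells from its neighbours;
        # these copies are what an MPI halo exchange would communicate.
        left = field[lo - 1] if lo > 0 else field[lo]
        right = field[hi] if hi < n else field[hi - 1]
        local = [left] + field[lo:hi] + [right]
        out.extend(diffuse_step(local))
    return out

field = [0.0] * 8 + [1.0] * 8
# The decomposed solve matches a single-"rank" solve exactly.
print(step_decomposed(field, num_ranks=4) == step_decomposed(field, num_ranks=1))  # True
```

In production the ranks run concurrently on separate machines and only the halo cells cross the network, which is what makes events like the Factory Ship explosion tractable.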
“We put a lot of effort into trying to make it more accessible to your more mid to junior FX TDs, rather than having a handful of expert TDs,” continues Illingworth. “The payoff was, we got very little notes from Jim Cameron on our fires on this film, compared to the previous film. Thankfully, we put all of that development up front and we got the payoff and the client was incredibly happy.”
New developments in character deformation
Another landmark technical breakthrough Wētā FX orchestrated for Fire and Ash was the creation of a skin deformation system called BodyOpt. Typically, on a creature-heavy film, anatomical features like muscles, tendons and skin would need to be rigged with intricate detail.
This kind of rigging did, of course, occur, but the new BodyOpt system allowed for an efficient way of focusing just on the outer skin, realizing the necessary skin deformations using a mesh neural network that was trained on a curated dataset.
“BodyOpt encodes a whole lot of deformation that’s been captured and writes it out as a BodyOpt data type that we can load very close to real-time in our big pipeline,” details Wētā FX creatures supervisor Tim Forbes. “We also have a solution that can load this for real-time puppet deformations on a proxy resolution level model for fast and performant playback for animations, so you get an idea of what their final deformations are going to look like as well.”
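The workflow Forbes describes follows a familiar pattern: encode expensive deformation results offline, then look them up and interpolate at playback time. The sketch below is a heavy simplification; BodyOpt’s internals are not public, and the 1-D “pose” parameter and every name here are illustrative assumptions:

```python
import bisect

# Heavily simplified sketch of the "precompute offline, evaluate near
# real-time" pattern the article attributes to BodyOpt. A 1-D pose value
# indexes precomputed per-vertex offsets; playback interpolates between
# the two nearest cached samples.
class DeformationCache:
    def __init__(self, samples):
        # samples: list of (pose_value, per-vertex offsets), sorted by pose.
        self.poses = [p for p, _ in samples]
        self.offsets = [o for _, o in samples]

    def evaluate(self, pose):
        # Linear interpolation between cached poses: cheap enough for
        # interactive proxy-resolution playback.
        i = bisect.bisect_left(self.poses, pose)
        if i == 0:
            return self.offsets[0]
        if i == len(self.poses):
            return self.offsets[-1]
        p0, p1 = self.poses[i - 1], self.poses[i]
        t = (pose - p0) / (p1 - p0)
        return [(1 - t) * a + t * b
                for a, b in zip(self.offsets[i - 1], self.offsets[i])]

cache = DeformationCache([(0.0, [0.0, 0.0]), (1.0, [0.2, -0.1])])
print(cache.evaluate(0.5))  # halfway between the two stored offset sets
```

The real system replaces this lookup table with a trained mesh neural network, but the payoff is the same: hero-quality deformation at a fraction of the evaluation cost.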
With BodyOpt, the problem Wētā FX was looking to solve was how to achieve consistent, reliable, hero-resolution deformations without paying the price of a dedicated simulation using the studio’s coupled simulation framework, known as Loki.
“The result is just more efficient delivery of shots, and more efficient and reliable skin deformations through into our super complex wardrobe simulation,” says Forbes. “In Fire and Ash, there were all these beautiful complex wardrobes. We wanted a stable and reliable input to make sure that we got a good result out of those wardrobe simulations. Especially when they’re running for 24 hours on the render farm, you want to know that you’re going to get a clean result out of the simulation with no issues caused from the input data. BodyOpt does this very well.”
Standout scenes
Asked to pinpoint their favorite moments in the film where Kora and BodyOpt were relied upon, the Wētā FX team identified some thrilling shots. Illingworth mentions a fiery scene in which the great leonopteryx, having crashed, crawls back onto a rock, opens its wings, and roars in front of a wealth of flames. “That shot was really fun to work on and showed off a lot of the new tools. There was a lot of thick smoke that had a very high absorption component, passing in front of the fire. There’s just so many different layers of complexity and flames within that one shot.”
For Forbes, meanwhile, just about every scene in the movie utilized BodyOpt’s character deformations, even for background characters. “However, for a more hero shot, I like the sequence where Jake and Neytiri are having an argument. We’d put in a fair bit of time and effort to try and get BodyOpt dynamics working in a more accurate and user-friendly manner for the film. In this sequence, there are some shots where you see the pinging of Jake’s thighs as they jiggle and catch tension in the firelight. That’s where we really got some extra little subtleties into BodyOpt, where we were able to take the detail that we had in this film just a little bit further than we had in the previous one.”

Thurston remarks that he was able as CTO to directly aid in the delivery of several scenes in Fire and Ash. “One of the joys of working in technology in studios is that you occasionally get to help with shows. In the third act battle, when the Ikran and the helicopters are all fighting with each other, I was working on making the crowd system render faster. Then, when you see the Tulkun actually enter the battle, that’s also a great moment, because it’s a blend of all the things we’ve done. Stuff starts blowing up and people start getting eaten. It’s just great.”
Brought to you by AMD:
This article is part of the befores & afters VFX Insight series. If you’d like to promote your VFX/animation/CG tech or service, you can find out more about the VFX Insight series here.