A number of the opening space scenes, arrival on Pandora sequences, and views of Bridgehead were ILM visual effects shots.
In mid-2022, as James Cameron’s Avatar: The Way of Water was taking final shape for its December release date, the filmmakers realised that the enormous task of bringing its complex worlds to life required some additional VFX firepower.
That’s when they brought in Industrial Light & Magic and visual effects supervisor David Vickery to complete a number of sequences, most of which had had significant asset development already done by Wētā FX, and had been subject to Cameron and his Lightstorm Entertainment’s rigorous ‘templating’.
The 48 shots that ILM took on largely centred on the opening moments of the film and the humans’ return to Pandora: the very first shot, a dreamlike swooping camera flying across the tops of the Pandoran forests; the floating mountains of Pandora; ships in outer space entering Pandora’s atmosphere and their destructive touchdown on the planet’s surface; and scenes of the characters arriving at a new Pandoran outpost called Bridgehead.
The shots were effectively ‘one-offs’. In the world of visual effects shots, ‘one-offs’ can be a challenging part of the deliverables for VFX studios. They refer to single (or very few) shots where a complicated creature, landscape or FX set-up needs to be built, but where all that work only appears in that single shot, or in very few shots, rather than being spread over a number of sequences.
Luckily, Industrial Light & Magic had the benefit of Wētā FX already having made incredibly detailed assets, as well as the benefit of a brand new standardised workflow for ingesting and sharing between studios, known as CAP (Common Asset Package). ILM had also previously worked on Cameron’s original Avatar to help bring that film to life in its final stages.
In this befores & afters interview, David Vickery breaks down a number of the scenes ILM delivered, the unarchiving of old ILM scenes from 2009, how his studio and Wētā FX collaborated (including via the new standardised CAP workflow), and his discussions with Cameron and the other filmmakers on the movie.
b&a: 48 shots doesn’t sound like too many, but those shots are incredibly detailed. Where did you get started?
David Vickery: It sounds like a small body of work. But we joined the project very late in the game, the goal being to give Wētā FX a little bit more bandwidth to finish their existing body of work. What was astonishing for me as we came onto the project was the sheer volume of assets that Wētā had built. Wētā had almost entirely completed their asset build. Everything was ready, bar the detail and actually putting those assets through their paces in the context of the shots.
The full asset list came through from Wētā in July 2022, and for 48 shots, there were 625 unique assets. That list didn’t include any of the landscape and terrain and was missing many of the more standard ‘earthlike’ variations of trees and foliage. The 625 assets were also just the individual variants. It didn’t take into account the many thousands upon thousands of instances of all of those assets that had been scattered through the landscape in the templates for the shots. There had been such an incredible attention to detail from the production designers for each and every shot that then fed into Wētā and their asset team led by Marco Revelant.
b&a: So you’re receiving those assets from Wētā, how does that process work?
David Vickery: The first hurdle is how do you actually ingest that much data? How do you then match the lookdev on all of that? You look at every single individual asset and you think, ‘That might take a day to ingest that and get the shaders set up and working and match the look and run out a turntable and then put it into dailies and review it.’ Well, that’s 625 days of modelling time!
Luckily there had already been some work between ILM and Wētā FX for a different show to develop a standard asset sharing package, a Common Asset Package that we call CAP, which essentially takes all of an individual asset’s textures and shaders and bakes them down into a standardised surface shader, allowing us to ingest the data and match the look development pretty quickly without having to rewire all the shaders.
Wētā FX would render their assets using their own internal shaders, bake those assets and textures down and re-render a second turntable using the standardised common asset package format. They would then share their assets look-dev environment with us so we could then ingest, plug it all back in, pipe it into ILM’s system, render it, and have a match.
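The bake-and-rewire idea behind CAP can be pictured in miniature. The sketch below is purely illustrative — CAP’s actual format and channel names aren’t public, and every name here is an assumption — but it shows the general shape of the workflow Vickery describes: collapse a facility-specific shader network into a fixed set of baked channels, then rewire those channels into the receiving studio’s one standardised surface shader.

```python
from dataclasses import dataclass, field

# Channels the hypothetical standardised surface shader understands.
# (Illustrative only; the real CAP channel set is not public.)
STANDARD_CHANNELS = {"base_color", "roughness", "normal", "specular"}

@dataclass
class CommonAssetPackage:
    """Baked representation of one asset: channel name -> texture path."""
    name: str
    baked_maps: dict = field(default_factory=dict)

def bake_to_cap(name, proprietary_network):
    """Collapse a facility-specific shader network into standard channels.

    Anything outside the standard channel set would, in a real pipeline,
    have to be baked down into one of the standard maps; here we simply
    drop it to keep the sketch short.
    """
    cap = CommonAssetPackage(name)
    for channel, texture in proprietary_network.items():
        if channel in STANDARD_CHANNELS:
            cap.baked_maps[channel] = texture
    return cap

def ingest(cap):
    """Receiving studio: wire baked maps into its own standard surface."""
    return {f"stdSurface.{ch}": tex for ch, tex in cap.baked_maps.items()}

cap = bake_to_cap("creature", {"base_color": "creature_col.tex",
                               "roughness": "creature_rgh.tex",
                               "proprietary_sss": "creature_sss.tex"})
shader = ingest(cap)
# shader maps standardised inputs, e.g. "stdSurface.base_color"
```

The point of the standardisation is visible in `ingest`: the receiving studio only ever has to rebuild one known shader, rather than rewiring an arbitrary proprietary network per asset.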
b&a: Was CAP being developed for a different project or just for other projects where you’d be sharing things?
David Vickery: Obviously facilities nowadays are always sharing assets. Marvel has done a brilliant job in sharing work from company to company, bringing 10 different facilities together but still achieving a common look across shared assets with those facilities. But we really put a lot of time and effort into that sharing pipeline on The Rings of Power where ILM and Wētā were already working together and sharing a lot of assets.
b&a: Of course, just being able to ingest doesn’t mean it just ‘works’. I’m sure there was a ton of work involved in the next stage?
David Vickery: Yes, every single shot in The Way of Water was incredibly complex. To give you an example, we were working on one shot with a Thanator and a number of other creatures charging through the forest and the jungle behind them is burning. The animals are being chased by a huge rolling wave of fire and destruction that’s razing the landscape. I looked at the template QT of the shot for our bid and could visually count around 12 hero creatures in it and thought maybe there would be another 20 hidden throughout the forest. We got the scene file from Lightstorm and Wētā and stripped it all out and counted 187 creatures in the shot! So much information.
b&a: Oh my gosh.
David Vickery: I have to say, The entire team at Wētā were incredible to work with. They were so generous with their time. Eric Saindon their VFX supervisor, Lena Scanlan their producer, Marco Revelant their asset supervisor – they had so much time for us, helping us get up to speed, sending us asset packages and scene files and stripping scenes back so that we could reassemble them accurately back at ILM. They were just brilliant. It was such a pleasure to work with them.
I was also astonished by the level of detail and care taken over the template scenes that we were given by Lightstorm. It’s a really interesting process but incredibly logical if you have the time to do it, and you definitely can’t call it previs, even though it looks like what you would normally call previs. My experience with previs to date has been that it’s treated very much like a jumping off point. It’s an exploration in early pre-production to help block a scene. It’s a development stage of a storyboard and it’s subject to change and evolve throughout production. You’ll likely shoot something that’s slightly different and then you go into postvis and the postvis changes yet more things, introduces new ideas, steps the sequence up another level and evolves it further. And then when you do your finished version of the movie, it might change again, evolving as the edit and ideas for the sequence evolve.
Well, the templates we got for this film were very different from your usual previs. The Lightstorm visual effects supervisor Richard Baneham said, ‘The template is right until it’s wrong.’ For example, we had ZERO handles on any of our shots for the movie. That alone is testament to how confident and committed James Cameron and his team were to those templates. The template really was the Bible and the goal was to match them in every way – where the shadows lie on the ground, where the ships are in the background. I started off thinking, ‘Well, what are we actually going to do? Just make the template look real?’ For a while I felt that those templates were going to really creatively handcuff us.
But then I realised that this wasn’t the case at all. What was going on here was that Jim Cameron and Richie had already made their movie, and essentially, what they were giving us was the world’s most complex version of paint by numbers. The difference here was that they weren’t asking a five-year-old to do the painting. They were asking Industrial Light & Magic to do it, staffed with some of the best visual creative artists in the world. The templates had given us a beautiful guide for how the movie needed to play visually and emotionally, and ILM was being asked to do the best that we could do and find a way to make it even more amazing. That said, Jim Cameron always wanted to hear ideas and ways that we could ‘plus the shots out.’ We became a trusted creative partner in making his film because we listened to him and respected the story that those templates were telling.
I think one of the reasons we’ve been able to make Avatar: The Way of Water look so great is that they had a director and a production designer and a visual effects supervisor in lockstep who knew exactly what they wanted and didn’t change their minds. All the time, energy and effort we put into the post production was focused on making those templates as good as they could be – not exploring different ideas, or presenting arrays of different options. It was a pretty amazing experience.
b&a: What kind of contact did you have with Jim Cameron?
David Vickery: It was so great; lots of collaboration with Jim. I had one call at the beginning with Wētā’s Eric Saindon, and then we dealt directly with Wētā for the asset ingest and then I had all my creative shot reviews initially with Richard Baneham. Richie shot the movie with Jim, so he was really dialled into the story of each and every shot. He’s an animator at heart, and was able to provide us with a lot of great insight into the Ampsuit rigs and designs and also the creatures.
We’d then have a separate meeting with producer Jon Landau to show him the same set of work. What was interesting was that he would have a slightly different perspective when looking at the shots. He was keen to make sure we had matched the templates: ‘Does it look like the template? Is there anything out of position here that doesn’t line up and is going to cause Jim to go, “Hey, what’s going on? Why did you change my shot?”’ Jim was so invested in the templates that he spent time placing lights in all the shots. So if we moved one and it changed the intent of the shot (whether we did that deliberately or not), it was essentially the same thing as someone moving a light during a practical shoot without asking the DOP or director. You just wouldn’t do that.
After we’d spoken with Jon, we’d have another meeting with production designers Dylan Cole and Ben Procter. Their focus was slightly different again. Richard looked at the animation and overall shots, Jon would look at how closely we were matching the storytelling of the template, and then Ben and Dylan were looking at the assets and the storytelling of those assets and the world that they’ve built. Was the level of wear and tear in keeping with how long the structures had existed on Pandora? Do the construction materials we’ve used make sense? Does the layout of the world stay in keeping with ideas they have for Avatar 3, 4 and 5?
We’d then have a review with Jim. It was pretty challenging from a timezone perspective. We’d be doing those reviews at 6am in the U.K. when it would be 6pm in New Zealand for Jim, and we’d just present directly to him.
The first review that we had with Jim was a turnover call. Bear in mind that we only had 48 shots, yet we had three hours of Jim’s time just going through those 48 shots. That’s how generous he was with his time. He was very specific and spoke a lot about light, emotion and storytelling.
The other really refreshing part of the process was that Jim would creatively approve our shots and not be fazed if they were low quality renders or there were hair edges that still needed work. He would simply say, ‘Well, I know you’re going to fix that. Clearly, that’s not going in my film, so I don’t need to see it again.’ So he would approve a shot and go, ‘Yep. Do your final renders.’ Then we shipped all those final renders back to Walter Garcia at Lightstorm, and Jim reviewed it reel by reel and signed it all off.
b&a: I wanted to ask you about the foggy forest opening shot, because when I was watching the film it reminded me, of course, of the opening shot of the first film, which I remember ILM’s John Knoll talking about at a conference years ago. It’s fun to think the same VFX studio was behind that.
David Vickery: Actually, that was quite scary for me and the whole team at ILM in London. That’s an extra layer of pressure just knowing the fact that somebody of the calibre of John Knoll had done this before and that we were trying to step up and into his shoes. That’s a good shot of adrenaline if you’re ever going to get one!
We obviously had the opening shot from the first Avatar as reference; that was our ‘template’ for the shot. On the original Avatar the shot was built from a lot of 2D smoke and fog elements stacked up in depth on cards, with a 3D camera move flying through them all that emerges out the other side onto a fully 3D rendered environment of the forest.
Richie Baneham had rewatched the shot from the first movie again and again and felt he could see echoes of the canopies of trees in these negative spaces around all the fog and smoke elements. I’m not clear whether that was deliberate or just the way those 2D elements came together, but his goal for this film was to re-work the shot, but this time using full volumetric 3D simulations for all the clouds. Those sims could then be held out accurately by a fully digital render of the Pandoran tree tops at the bottom of frame.
The shot is supposed to put you in this dreamlike state where you think you’re in the clouds. Like a memory of what Pandora looks like slowly returning to us as we watch the shot evolve. You start to catch glimpses of the tree canopies beneath camera and just when you think the shot is going to open up and you are going to arrive in Pandora, it fades back down again and hard cuts to the floating mountains. It’s really effective as you get this emotional hammer blow with the floating mountain shot supported by the building score of the music. It’s Cameron’s way of saying, ‘We’re going to the movies. We’re going to Pandora. This is not reality but you’re going to live it and breathe it for the next 3 hours.’
b&a: How did you realize those openers, in the end?
David Vickery: All the volumetric sims and the renders were done by ILM. For the silhouettes of the canopies underneath, we got a bunch of Pandoran tree assets from Wētā. A lot of the trees on the Pandoran landscape are very similar to trees that you get on Earth, but we had to find the right types of shapes of structures so we could use our own library of trees to generate that landscape.
For the second shot, we unarchived the original floating mountain shot from 2009.
b&a: You did?
David Vickery: Yeah, the whole thing. Every single bit of it. And it is really fun to look at how that shot was executed, just to go back and look at how we were doing that stuff in 2009. The shot is built from matte paintings projected onto simple geometry with elements of 2D smoke and fog. It’s wonderful to see how ILM put the shot together back then. We had to make it look exactly the same (with a few minor tweaks and some extra Ikran) but at 4K 48fps and bring it into the 2020s.
b&a: What about those ship shots in space? How were those done?
David Vickery: As with all the shots, they started with a set of templates from Lightstorm. From what I could tell, many of the shots were actually shot by Jim with the virtual camera. Richie would set up the Motion Builder scenes at Lightstorm and then go and shoot out the sequence and then Jim would step in and often just take the camera himself and shoot it.
For the Manifest Destiny shots, the ships themselves were actually completely static and it was the camera doing all the motion, because it’s Jim on a stage with the camera just walking past the ships.
In the first meeting, I looked at those huge ships and they had those massive reflective mirrors on the front, and in my ignorance (or my idiocy), I referred to them as solar cells. They’re definitely not solar cells. Jim was like, ‘They’re not solar cells. They’re heat shields. They’re heat shields protecting the ship and its anti-matter annihilation engine, because it’s decelerating from 0.6c.’ Jim knows his stuff. He really cares about the science being right.
The shots also required us to remake Polyphemus and Pandora. The big emphasis for us here was looking at NASA photography of Earth and Jupiter. We weren’t actually using NASA photography, but we could split-screen NASA photography against our own images and go, ‘Okay, yeah, yeah, what we’re doing makes sense. We’ve got this atmospheric bleed around the edge of the planet correct, and the terminator line makes sense, and the scatter of light through the clouds is making sense.’ And then we could remove the NASA photography and know that we’ve got a faithful image. Jim said, ‘If you need any more stuff, I can probably get that for you. I know a few guys who used to be on the board at NASA. They can get you that stuff.’
For the sequence where Quaritch arrives at Bridgehead and the Garuda ship is landing, I remember Jim talking to us about the engine thrusters. They look a bit like a Harrier jet’s. Jim said, ‘If you want to work out how much energy those things are dissipating, you just need to take the weight of the ship, divide it by four because there are four engines, and then you can work out the pounds per square inch of thrust vectoring that’s going on in the ground right beneath it, and you can use that to drive your simulations.’ It’s just an amazing process that he goes through, because he wants it to be physically plausible and accurate.
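Cameron’s back-of-envelope calculation can be sketched numerically. In a hover, each engine supports a quarter of the ship’s weight, and spreading that thrust over the exhaust footprint gives a ground pressure a simulation can be driven by. All the figures below are invented purely for illustration – no real Garuda specifications exist in the source.

```python
# Assumed, illustrative figures only (nothing here comes from the film's specs).
ship_weight_lbf = 400_000        # hypothetical ship weight in pounds-force
num_engines = 4                  # four engines, per Cameron's description
footprint_in2 = 60 * 60          # hypothetical exhaust footprint per engine (in^2)

# Hovering: each engine's thrust carries a quarter of the weight.
thrust_per_engine_lbf = ship_weight_lbf / num_engines

# Pressure on the ground directly beneath one engine.
ground_pressure_psi = thrust_per_engine_lbf / footprint_in2

print(f"per-engine thrust: {thrust_per_engine_lbf:,.0f} lbf")
print(f"ground pressure:   {ground_pressure_psi:.1f} psi")
```

The resulting pressure figure is the kind of single physical parameter that can scale the intensity of a dust or debris simulation under the landing ship.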
b&a: When the ships are landing and basically causing that fireball, how did ILM approach that side of things?
David Vickery: Jim described it as like the hand of God reaching down through the clouds and just tearing up the land. There was some reference that we could lean into, such as Saturn V night launches in terms of exposure of light for the shots when the Destiny’s coming down through the clouds and starts to burn the landscape.
We had incredible effects artists for the entirety of our post production looking at each one of those shots. Lead FX TD David Kirchner looked after the shots with the clouds parting and the big pyroclastic-like flow, with the plumes of smoke and ice and ash that are ripped up by the ship as it lands. Miguel Perez Senent, one of our CG supervisors, looked after the shot of the home trees being destroyed as the huge rolling fireball comes at it.
It was all Houdini-based simulations taking reference either from Cameron’s previous films, like the big nuclear blast in Terminator 2, or even Armageddon with its huge explosions. So much of it was about emotion. Cameron would often go and sit closer to the cinema screen in our reviews so he could ‘feel the burn’.
Using light was really important as well. There’s this cold, clinical light of the humans and their machines coming down counterpointing with the warmth of the light of the fire. It was all about how we could really key into the emotion of the scene and make people feel like these humans were invading their space and just ripping it into pieces without any cause or care.
b&a: Can you talk about what ILM had to do for Bridgehead?
David Vickery: It’s only six or seven shots but the assets for that sequence were insane (in a good way). Our CG supervisor Steve Ellis did an amazing job bringing those scenes to life. There were 15 different types of ground transport vehicles, hundreds of them on the ground, plus sparks, dust, bulldozers, cranes and diggers, humans driving ampsuits, construction workers, flying craft like the Sea-Wasps, Dragons and blimps, and hundreds of welder assembly bots swarming over the buildings, all moving and spot welding and arcing. It was all about how we could add detail to breathe life into this huge space.
There were bits of set dressing in the deeper background of a shot and Ben Procter would say, ‘Oh, that’s really important for Avatar 3. We need to make sure we get that right because there’s a massive thing going on over there in Avatar 3.’
There’s huge diggers that are churning up sandy coloured dust. There’s factories in the background pumping out air conditioning steam and smoke and burning fossil fuels. There’s desert winds lifting sand and blowing that through. We’d talk about, ‘How can we build up and diffuse the environment to get the right color of light contamination and shadow pollution through to make it feel like a real thing?’
Interestingly, our texture lead Mark Young worked on the Valkyrie texturing at Wētā on the first Avatar. The first asset we gave him was the ship the characters travel in on their way over to Bridgehead, which is actually only a slight modification of that Valkyrie asset. He’s like, ‘Hey, I did this on Avatar and now I’m doing it again….’ But he did such a great job.
b&a: Did you need to do any particular cross-overs with Wētā FX’s shots at all?
David Vickery: There’s a shot where we cut back and forth between a wide ILM shot featuring the burning Pandoran forest and a Wētā shot, which is a close-up reverse on Neytiri and Jake, when she’s crying. The characters in our shot are from Wētā and the distant burn is all ILM. Wētā provided us with a foreground layer with the creatures and we did all of the deeper background and then the subsequent POV of the burning firewall. If there were Na’vi in the shot, Wētā created them but then they would provide us with them as a render that we would then comp.
b&a: How did you deal with any of the high frame rate aspects of the work?
David Vickery: We’d heard that Jim was upgrading shots to 4K or he’d say, ‘Okay, we need this one to be at 48fps now.’ The default for us was to assume 48fps and then we could drop frames to deliver 24fps media to the AVID and review it with Jim.
However, when rendering at 48fps, you don’t want a 180 degree shutter. So we actually set our shutter to 172.8 degrees, purely because that’s what running film cameras use when they’re trying to be flicker free. If we knew that we were going to render the shot at 48fps and deliver at 48fps, we would just halve the shutter.
Often, however, we found that due to the crisp clarity and slow motion of objects, we needed this to render with even shorter shutters. It was just a case of looking at the shot full res on a big screen and deciding whether or not we needed to arbitrarily change the shutter just for the shot to make sure that you got the detail.
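The shutter arithmetic here is simple: exposure time is the shutter angle’s fraction of a full 360° revolution divided by the frame rate. A minimal sketch (the 172.8° figure is the classic flicker-free angle for 24fps under 50Hz lighting; the halved 86.4° case is an assumption about what ‘halve the shutter’ means in practice):

```python
def exposure_seconds(shutter_angle_deg: float, fps: float) -> float:
    """Exposure time implied by a shutter angle at a given frame rate.

    A 360-degree shutter exposes for the whole frame duration (1/fps);
    any smaller angle exposes for that fraction of the frame.
    """
    return (shutter_angle_deg / 360.0) / fps

# 172.8 degrees at 24fps -> exactly 1/50 s, hence flicker-free under 50 Hz.
e_24 = exposure_seconds(172.8, 24)

# Halving the shutter angle at 48fps -> 1/200 s, a much crisper image.
e_48 = exposure_seconds(86.4, 48)

print(f"24fps @ 172.8 deg: {e_24:.4f} s")   # 0.0200 s
print(f"48fps @  86.4 deg: {e_48:.4f} s")   # 0.0050 s
```

The quartering of exposure time (1/50s down to 1/200s) is why, as Vickery notes, some 48fps shots needed per-shot shutter adjustments to keep motion detail where they wanted it.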
b&a: Did stereo prove challenging?
David Vickery: We received stereo camera rigs from Lightstorm and what I really applauded about the process was that there wasn’t a tendency to try and adjust the interaxial (or interocular) from shot to shot to try and make things feel more ‘stereo’. That could have quickly had the effect of miniaturising objects. The space shots are huge, for instance. Those ships are three or four kilometers long, so there’s negligible stereo deviation between the left and right eye when you have a standard human interocular. And they were like, ‘That’s fine. It is what it is.’
We used a fairly standard 2.65 inch interocular separation. There were a couple of instances where that varied but it was only on the shots that were predominantly shot live action, the shots of Spider in the interrogation room for example. In those instances they made a call on the day to change the camera setup.
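The negligible stereo deviation Vickery mentions falls straight out of the geometry: the angular disparity of a point seen by two eyes scales roughly as interocular separation over distance. A quick sketch — the 2.65-inch figure is from the interview, while the two distances are purely illustrative:

```python
import math

def disparity_rad(interocular_m: float, distance_m: float) -> float:
    """Angular disparity (vergence angle) of a point at a given distance.

    For distances much larger than the interocular separation this is
    approximately interocular / distance radians.
    """
    return 2.0 * math.atan((interocular_m / 2.0) / distance_m)

IO = 0.0673  # 2.65 inches in metres, as quoted in the interview

near = disparity_rad(IO, 2.0)      # object 2 m away: a strong stereo cue
far = disparity_rad(IO, 3000.0)    # a 3 km distant ship: effectively flat

print(f"2 m:   {near:.5f} rad")
print(f"3 km:  {far:.8f} rad")
```

At 3km the disparity is on the order of twenty microradians – far below anything a viewer can resolve – which is why boosting the interaxial per shot would have been the only way to ‘force’ stereo on those ships, at the cost of miniaturising them.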
b&a: This is all really fascinating because it reminds me of discussions I’ve had with VFX supervisors about ‘one-off’ shots in films, but this feels like an intense version of that.
David Vickery: They’re all one-offs. Every single one of our 48 shots. That was the real challenge on the show. We had to try and set up a pipeline to ingest 625 assets and create 48 huge one-off shots. Every single shot was completely unique in its own right, whether that’s because of the FX or because of the creatures that were in it, or the landscape. But you’re right, they were one-offs. 48 of them.