The VFX supervisor for ‘Top Gun: Maverick’ breaks down the biggest visual effects challenges in the film, and some of the invisible effects surprises from the team.
Joseph Kosinski’s Top Gun: Maverick is nominated for the Best Visual Effects Oscar at the 95th Academy Awards. The listed nominees for the film are Ryan Tudhope (production visual effects supervisor), Seth Hill (visual effects supervisor, Method Studios), Bryan Litson (visual effects supervisor, MPC) and Scott R. Fisher (special effects supervisor).
The majority of the more than 2000 visual effects shots were accomplished by Method Studios (now Framestore), with MPC and an in-house team also tackling many scenes. Amazingly, the VFX work was completed in 2020 for a release that year until, well, COVID. When the film was finally released in 2022 it of course became a phenomenon.
The aerial scenes, in particular, wowed audiences, not only for their dynamism and the interplay between the characters (the new Top Gun pilots and the veteran, Maverick, played by Tom Cruise), but also because of the feeling that so much of the film had been shot for real, with actors in cockpits flying in real planes.
This impressive practical side to the filming was certainly true. It was made possible by a close collaboration between the US Navy and the production, and by DOP Claudio Miranda’s installation of a bank of 6K Sony Venice digital cinema cameras in front of the actors–who sat in the back seat–of the F/A-18s used during filming. The idea was to capture plates of the actors in the cockpit from multiple angles while also capturing the real environments outside.
Furthermore, a significant effort was made in photographing real aircraft, like the F/A-18s, in the air. Oftentimes this was not the aircraft that would appear in the final frame; instead it might be a stand-in, or even the camera jet itself that was used for filming aerial plates. In addition, cameras were hard-mounted on the outside of planes to acquire footage.
However, given the number of planes required for the story, the insane maneuvers many of them make, and the fact that some of them simply don’t exist or were not in operation–including the hypersonic Darkstar, the F-14 Tomcat, and the Su-57s–visual effects and CG animation were always going to be part of the storytelling in Top Gun: Maverick.
The visual effects team would, after meticulous tracking and attention to lighting detail, ‘re-skin’ the stand-in planes as the necessary aircraft, or animate CG ones from scratch for certain scenes. It would also utilize additional programmable cockpit gimbal footage to help craft the final shots. The VFX detail extended right down to the ‘micro-scratches’ inside cockpit canopies that reacted to shifting sunlight, and to hand-animated motion blur on aircraft shots.
In addition, the unnamed nuclear facility the crew attacks, and other locations, would need to be computer-generated environments, while explosions, missiles, missile trails, destruction and additional FX were also synthetic.
Still, as you’ll find out in this befores & afters conversation with production visual effects supervisor Ryan Tudhope, who hails from Framestore, all the VFX work would be informed from what could be captured for real, and often in consultation with actual Navy pilots.
Here, Tudhope breaks down some of the major components of putting together the visual effects for Top Gun: Maverick, especially related to the CG jets and environments. This includes a range of supporting and invisible effects work, such as dealing with visors, jet formations and armaments, and some secrets about the sailing sequence featuring Tom Cruise and Jennifer Connelly’s characters.
Photo-scanning real jets
Ryan Tudhope: It was really important to Joe Kosinski and me that everything we built would be based on real photographic reference–we were literally trying to create digital clones of these aircraft. We were up close and personal with them, walking down the length of them with our hands on them, seeing all the imperfections, taking reference images of how the light moves across different surfaces and just getting familiar with all the little dents and scratches.
For the F/A-18s, we were actually able to get the Navy to place one on the tarmac where we could set up our camera rigs and our light spheres. They rotated the jet 20 degrees, we took photos, they rotated it another 20 degrees, and we took more photos. So we literally turn-tabled real jets on the tarmac in the same lighting so that we could match it digitally in our system. It seems crazy in hindsight asking someone to rotate an F-18 for you, but it was an opportunity we couldn’t pass up.
The L-39 stand-ins (and painting them gray)
We had been utilizing L-39s, which are basically stunt and trainer aircraft, as one of the camera platforms that we shot a lot of the F-18 footage with. With the support of our aerial coordinator, Kevin LaRosa Jr., we painted these jets matte gray to provide a great lighting reference, and we covered them in tracking markers. We went up in the air in these various locations, shot air-to-air, and tried to get the shot as best we could. The aircraft are not as high-performance as a fifth-generation Su-57, or even the Tomcat, but they are jets and they behave in a physically very similar way.
Kind of like motion capturing a real jet
So we shot these jets for real, and then you’ve got this jet doing this thing in the air and it’s effectively motion capture. Sometimes the performance is what you want and sometimes it’s got to be worked on by an animator from there. Oftentimes what that required from our team and our animation supervisor, Marc Chu, was modifying that performance in a way that was more interesting for the shot, but still honored that inherent underlying motion and didn’t change the foundation of what we were doing.
One thing we had to learn was how every little flap moves, how the flaps work together, and how the air brakes work. We all had to get up to speed on what should happen when a jet does a single-engine stall and a pirouette move in the air and starts to come down. Those shots are so convincing, I think, because the team really grabbed onto that and made sure that every little detail, every little flap movement, was appropriate.
We had amazing support from the naval aviators by looping them into that animation process. They would come in to see how things were going and we would show them gray shaded animation takes and say, ‘Does this feel right to you?’ Sometimes I’d have five or six pilots all sitting around the table and they’d be like, ‘Yeah, well, I think maybe just a little more pushed over. There’s something that just doesn’t feel right about that.’ There were so many people who were willing to contribute and help us get it to the point where it was hopefully seamless and people don’t question that there’s a Tomcat doing a maneuver like that.
Re-skinning planes with CG models and textures via GPS tracking data
One of the big challenges was that, especially towards the end of the film, we filmed in really remote locations out in the Cascades and out in Nevada. Not every single shot could have an entirely bespoke light-rig setup, so the team had to figure out ways to make sense of the time and place of when things were filmed.
So we relied on the GPS data we had from our various camera platforms, USGS data, LiDAR and photogrammetry. We didn’t have LiDAR of the Cascades, but we had photogrammetry that one of our VFX team members, Devin Breese, shot while in our safety helicopter that was standing by during filming.
It was really about trying to pull all of that data together and recreate effectively what had been done for real so that our light rigs would accurately represent the lighting on the day. We also used the data to properly track our camera jet motion, the subject jets, and the terrain, to ensure any added effects, explosions, etc. would behave in a physically accurate way.
Jet additions: bombs
When you put real bombs and missiles–or the weight of them–on the wings of an F-18, the computer systems will adjust the performance characteristics of the jet so that it can handle that extra weight. It won’t let you do particular maneuvers as well as you could with clean wings, which, when you’re making a Top Gun movie, is not what you want. You want the jet working as best as it can, doing the best it can, from a performance standpoint.
So a lot of the armaments were added in. That was a prime example of where supporting visual effects could come into play in a really simple sense–although simple is all relative, because you’ve got all the vibration on the wings, everything that’s moving, and the lighting that’s swinging around. As soon as you’re sticking digital things onto wings right in front of your face, it’s just got to look right, it just has to be convincing, so it was very meticulous work.
The art of jet formations
There’s a number of scenes where our characters needed to be up in the air with a formation of jets around them. A lot of the shots where jets are seen out the window could have been coordinated and captured for real, but it turned out that doing so added another layer of complexity and strain on resources.
We determined early on through various tests that adding other jets in formation out the canopy windows was something we could do convincingly in VFX. That allowed the team to really focus on prioritizing the actors in those moments and getting the performances they needed, rather than the performance of the pilots outside the window, which is exponentially more complicated. I think that’s another really great use of supporting visual effects where you can give the filmmakers the ability to focus on what’s most important in a given shot and we can help take it the rest of the way.
Programmable cockpit gimbals provided special shots and specific set-ups
Our special effects supervisor Scott R. Fisher built these amazing cockpit gimbals and detailed cockpits for the F-18s, the Tomcat and the Darkstar. That work focused very specifically on the things we knew we couldn’t get for real in the air, such as shots featuring both the pilot and the back-seater, dramatic moments where we really wanted to get particular important shots, or shots that were simply too difficult to get in the air.
We were able to program the gimbal with real motion-tracked information from the real F-18s and get it to have a really convincing motion. We shot these outdoors for the F-18s and Tomcat. We explored shooting them inside on a stage with a light bubble wrapped around them, but ultimately there wasn’t enough light to get that same small-aperture look that we had established in our aerial photography.
For the Darkstar, which was a full-scale model of the cockpit mounted on a gimbal, we were able to use some of the footage shot from inside cockpits–which we sometimes ran with no actor–and place it on LED screens for lighting reference. Eventually it becomes a digital environment, of course, because they’re going higher than we could really take our aircraft.
The Darkstar also had this plasma moment, which was this super-heated material that starts forming on the outside of the skin, right outside the window. That was an important element for us to develop early on so that we knew what the light on Maverick’s face was going to be for when we filmed in this gimbal.
The challenge of visors
One of the things we ran into when we were doing our gimbal work on the ground is that the actors’ visors were reflecting everything. So we ended up pulling the visors off and then tracking digital visors onto the helmets. It’s something that was required on quite a few shots, and something you just wouldn’t think about or know before you get into it.
FX: explosions, missiles, flares, vapor cones and more
We had an amazing FX team on this. For explosions, say, they took what we gave them and ran with it in terms of getting these explosions to feel very big in 3D, and to feel like it goes off and then recedes into the distance, but it’s still churning and moving and you’re moving through them.
Those kinds of effects simulations are super fun. There are really two elements to them. There’s the more editorial/high-energy side of it, which is owed to Joe, of course, and our editor, Eddie Hamilton. We all wanted to create a greater and greater sense of danger as smoke trails build up while the battle sequence unfolds. Then there was a major design element, from a sequence-wide point of view, in deciding where all the smoke trails were going to be, and we’d be drawing and sketching all of that out.
Crafting a world
Like the rest of the visual effects in the film, there was always something real that we were basing our decisions on for environments. Take the environment in the sequence with the enemy base at the bottom of the bowl, for example: we spent a long time scouting various locations in the Cascades mountain range in Washington from a helicopter, looking for a suitable environment to stage those sequences. What we found was half of what we wanted.
It was half of a huge valley with a bowl at the bottom, but the backside of that location was completely open and led into another valley. We ended up digitally closing off the other end of the bowl so that it fit what was required of the story, which is that they come up over a ridge and there’s this literally 360-degree bowl that they have to dive down into, bomb, and then fly up the other side of.
That was a good example of a location where our augmentation played into something that we had found practically. Of course what that also gave us was great reference. We were able to hose down that particular environment with textures and photogrammetry and utilize a lot of that as we designed the other side of our environment.
In other cases, it was simpler work, more about getting the continuity between environments to feel correct, because we might have shot over the course of multiple seasons in the desert, where sometimes there was snow and sometimes there wasn’t. It was all really in the spirit of starting with something real and extending it in a seamless and supporting way.
Even for the enemy airfield, we worked really closely with production designer Jeremy Hindle to determine what we were going to be capable of building practically and where that blend was going to be. It was really like any project where you go as far as you can practically and then carry it the rest of the way.
When Tom and Jennifer went sailing
For this sequence they went up to San Francisco to film the scene, which was meant to be set in San Diego, so there were very simple augmentations to the background to remove certain landmarks. There was no digital water–that was all real. But there was a safety component: there had to be someone in the boat helping them, who we removed in shots where they were visible. It was obviously really complex paint work to do that, but what you’re seeing is as real as it looks.
And yes we did have to replace the color of Jennifer Connelly’s pants. That was more of a continuity issue, where we shot it early on with one costume, something changed in the edit, and then for that scene she was wearing the wrong colored pants. We had to change the color of them. Honestly, when this came up, there were much bigger things on our minds, so that was the least of our worries. When that shot came up, I was like, ‘Sure, what color do they need to be?’