The virtual production and visual effects art and tech that brought authenticity to ‘Masters of the Air.’
Flying planes, cockpits and clouds. These were just some of the challenges thrown at DNEG, which acted as the principal VFX vendor on the Apple TV+ series Masters of the Air. The show follows the 100th Bomb Group during World War II and features an enormous amount of action up in the sky, including dramatic aerial battles.
Nascent virtual production techniques and significant visual effects were required to construct lengthy in-air sequences. All of this work was overseen by production visual effects supervisor Stephen Rosenbaum, with special effects supervisor Neil Corbould handling the plane gimbals shot on LED wall stages, along with other practical effects.
Working closely with Rosenbaum was DNEG visual effects supervisor Xavier Bernasconi, who joined the project for the on-set work at the beginning of the virtual production shoot before shepherding several DNEG crews from across the globe with the help of DNEG visual effects producer, Abigail Everard. Some of the most challenging aspects of DNEG’s visual effects included building period-accurate planes, dealing with airplane window views and crew reflections, and manufacturing believable landscapes and clouds.
Here’s a look, with Bernasconi, at the key virtual production and VFX challenges of the show.
Making the show before making the show
Masters of the Air relied on the latest virtual production methods to film cockpit and fuselage scenes during 2021. Many elements had to be worked out for real-time playback, not least of which was generating imagery of clouds, ground planes, the planes themselves and flak, and then connecting key moments to a gimbal system.
Actors would be filmed in cockpit and fuselage mock-ups on motion base gimbals on LED wall stages (one large horseshoe volume for cockpits, a flatter volume for the fuselage, and smaller LED walls for ball turrets and other setups). The idea was to immerse the actors and crew in pre-rendered flying and battle scenes and use that imagery for what was effectively highly accurate interactive lighting.
Previs imagery from The Third Floor would be turned over to DNEG Virtual Production (a partnership between Dimension Studio and DNEG), led by managing director Steve Jelley, for readying as real-time Unreal Engine imagery for the LED wall, which was run by Lux Machina. DNEG, and some other vendors, then took that cockpit imagery and generated views outside the windows, as well as hundreds of exterior sky-level plane shots.

The Unreal Engine imagery proved extremely useful for actors to react to planes, flak, explosions and other moments during the shoot. However, what was on the LED walls was not intended to be final-pixel imagery. Still, it became a useful part of the process, states Bernasconi.
“The actors had grueling days because they were in their cockpit for hours at a time. But shooting there with the LED walls helped make them feel so much a part of the scene. On top of that, you see explosions, you see you’re flying through clouds. When the talent saw it, they were like, ‘Holy crap!’ They were pretty impressed with it.”
In early testing, Bernasconi says he and Rosenbaum took a test flight in the cockpit gimbal and soon became nauseous. The reason was that, at that stage, no wings appeared in the real-time imagery on the LED walls. “There was no correlation for what was going on outside the cockpit, so we knew we had to add the wings, otherwise the actors were going to suffer. We had to come up with a system where we were motion capturing the gimbal system, feeding it back into Unreal and then rendering the wings on there as well. All this occurred with only a 7ms delay and still felt real-time. That was one of the many challenges we experienced throughout the production, because we were doing so much of it for the first time.”
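Bernasconi doesn’t detail how the gimbal data reached the engine, but the general shape of the loop can be sketched. Below is a minimal, hypothetical Python example of streaming a motion-base pose to a listening render engine over UDP; read_gimbal_pose(), the message schema and the address are all illustrative stand-ins, not DNEG’s pipeline or an Unreal API.

```python
# Minimal sketch of streaming a motion-base pose into a real-time engine.
# Hypothetical: read_gimbal_pose(), the JSON schema and the host/port are
# illustrative stand-ins, not DNEG's actual setup.
import json
import socket
import time

ENGINE_ADDR = ("127.0.0.1", 54321)  # hypothetical listener inside the engine

def read_gimbal_pose():
    """Stand-in for the motion-capture read of the gimbal's 6-DoF pose."""
    return {"roll": 0.0, "pitch": 2.5, "yaw": -1.0, "x": 0.0, "y": 0.0, "z": 0.0}

def stream_poses(rate_hz=120):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    frame_time = 1.0 / rate_hz
    while True:
        t0 = time.perf_counter()
        pose = read_gimbal_pose()
        pose["timestamp"] = time.time()  # lets the receiver measure latency
        sock.sendto(json.dumps(pose).encode("utf-8"), ENGINE_ADDR)
        # Sleep off the remainder of the frame to hold a steady send rate;
        # the capture-to-render latency is what must stay low (~7ms here).
        time.sleep(max(0.0, frame_time - (time.perf_counter() - t0)))
```

Timestamping each packet is what would let the receiving side measure the end-to-end delay the team held to roughly 7ms.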
Getting started on cockpit VFX shots
Bernasconi knew that after principal photography, one of the biggest challenges for DNEG on Masters of the Air would simply be the scale of the work (for DNEG it would be something like 2,000 shots in 18 months). Cockpit shots were a major part of this, with the views outside the cockpit windows all needing to be both generated and composited in. So one of Bernasconi’s first tasks was to generate some fast rotoscoping of cockpit shots, with the idea of having at least something to work with. Certainly, some machine learning roto solutions had emerged by then, but there was a problem, as the VFX supervisor explains.

“I couldn’t use any of the available roto tools because of the masks that the pilots and crew were wearing,” he says. “Any model that I used to try to do segmentation for the rotoscoping would fail. So, I had to go in and train my own model. I find this aspect of my job, developing and testing new technology, to be one of the most exhilarating.”
Ultimately, some quick rotoscoping of the cockpits was done, but nowhere near final (indeed, Bernasconi mentions what a fine art final rotoscoping is, and eventually the plates would all be sent for rotoscoping as normal, but the quick roto was crucial in getting started with the VFX work as early as possible).
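Bernasconi doesn’t specify the model or framework he used, but as a hedged sketch of the general approach, fine-tuning an off-the-shelf segmentation network on hand-labeled crew mattes in PyTorch could look something like this; MatteDataset is a random-data placeholder for real plate crops and mattes.

```python
# Sketch of fine-tuning a stock segmentation model on hand-painted crew
# mattes, in the spirit of training a custom roto model when off-the-shelf
# ones fail. Assumptions: PyTorch/torchvision; the dataset is a placeholder.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, Dataset
from torchvision.models.segmentation import deeplabv3_resnet50

class MatteDataset(Dataset):
    """Placeholder: would load plate crops and matching 0/1 crew mattes."""
    def __len__(self):
        return 64
    def __getitem__(self, idx):
        frame = torch.rand(3, 256, 256)           # stand-in plate crop
        matte = torch.randint(0, 2, (256, 256))   # stand-in binary matte
        return frame, matte

model = deeplabv3_resnet50(weights="DEFAULT")    # start from pretrained weights
model.classifier[4] = nn.Conv2d(256, 2, kernel_size=1)  # 2 classes: crew / not

loader = DataLoader(MatteDataset(), batch_size=8, shuffle=True)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for frames, mattes in loader:
    optimizer.zero_grad()
    logits = model(frames)["out"]                # (B, 2, H, W) class scores
    loss = criterion(logits, mattes.long())
    loss.backward()
    optimizer.step()
```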
In addition to what was outside the plane windows, DNEG faced dealing with the windows themselves and the reflections, scratches and light transport through them. This involved a number of visual effects solutions.
The first was the reflections of pilots and crew. “We created digi-doubles for every pilot and crew member, and we body tracked every single one of those–every single one in every shot,” notes Bernasconi. “And if they were not in the shot because they were outside of the field of view, we used motion capture clips for any crew member that would be visible in the reflection, in keeping with the action of the surrounding shots. We’d then render reflection passes of those body tracks for all shots and comp them in.”
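As a simple illustration of that final step, a rendered reflection pass weighted by a per-pixel window reflectivity can be added over the plate in linear light. Everything below is a stand-in, not DNEG’s comp setup.

```python
# Sketch of adding a rendered crew-reflection pass back over a window plate.
# Assumptions: linear-light float arrays; 'fresnel' would come from the
# window shader (view-dependent reflectivity), here it is a flat matte.
import numpy as np

def comp_reflection(plate, reflection_pass, reflectivity):
    """Additive comp of a reflection render, weighted per pixel."""
    return plate + reflection_pass * reflectivity[..., None]

h, w = 1080, 1920
plate = np.random.rand(h, w, 3).astype(np.float32)       # stand-in plate
reflection = np.random.rand(h, w, 3).astype(np.float32)  # digi-double pass
fresnel = np.full((h, w), 0.08, dtype=np.float32)        # stand-in matte

out = comp_reflection(plate, reflection, fresnel)
```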
Dealing with elements outside the windows began with the generation of three levels of CG laminar flow around the plane. Even though this particular airflow is never seen directly, it was needed to distort the images of the backgrounds visible through the windows.
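In compositing terms, a rendered flow field can drive a 2D displacement (STMap-style) warp of the background. Here is a small sketch of that idea, with a sinusoidal field standing in for a displacement rendered from the CG airflow.

```python
# Sketch of distorting the background seen through a window with a rendered
# displacement field, the role the laminar-flow passes played. The sine
# field here is a stand-in for one rendered from the CG airflow.
import numpy as np
from scipy.ndimage import map_coordinates

def warp_background(image, dx, dy):
    """Resample 'image' at coordinates offset by the displacement field."""
    h, w = image.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w].astype(np.float32)
    coords = np.array([yy + dy, xx + dx])
    out = np.empty_like(image)
    for c in range(image.shape[2]):  # warp each channel separately
        out[..., c] = map_coordinates(image[..., c], coords,
                                      order=1, mode="nearest")
    return out

h, w = 540, 960
bg = np.random.rand(h, w, 3).astype(np.float32)   # stand-in background render
yy, xx = np.mgrid[0:h, 0:w]
dx = 2.0 * np.sin(yy / 18.0).astype(np.float32)   # stand-in flow displacement
dy = 1.0 * np.cos(xx / 25.0).astype(np.float32)

warped = warp_background(bg, dx, dy)
```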
A scratch pass on all the windows was also undertaken. “There were so many variations,” says Bernasconi. “We don’t have a single window that is the same. And with the progression of the show, the weathering increased. We also rendered passes for the way the light interacts with those scratches. So if you’re banking the plane, those scratches catch the light in a certain way.”
Light modulation was yet another consideration. The light in the cockpit was modulated by the laminar flow rendered for the exterior, so that the cockpit did not contain only static lighting. Light contamination in the lens from explosions, bullet hits or passing tracers was also dealt with.
Then there was camera shake. To accurately reflect the kind of movement inside the cockpit and cabin, DNEG developed a system in Nuke that had four levels of shake at different frequencies and amplitudes, as Bernasconi explains. “First, there were external elements like flak, bullet hits and turbulence that gave a low frequency, high amplitude camera shake. Then we had engine camera shake, usually a high frequency, low amplitude camera shake. Then we had what we called de-coupling. Let’s say that you have a camera on the plane, it would have a delay and a dampening system on the actual camera itself that would try to compensate for the shake, but add a little bit of wobble.”
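As a cut-down illustration (not DNEG’s Nuke setup), two of the described noise layers can be summed and then chased by a damped “de-coupling” follower; every frequency, amplitude and spring constant below is invented.

```python
# Sketch of layered camera shake in the spirit of the setup described:
# noise layers at different frequency/amplitude pairs, plus a damped
# follower that lags the raw shake and adds wobble. Values illustrative.
import numpy as np

def smooth_noise(n_frames, frequency_hz, fps=24.0, seed=0):
    """Band-limited noise: detuned sinusoids summed into a smooth signal."""
    rng = np.random.default_rng(seed)
    t = np.arange(n_frames) / fps
    phases = rng.uniform(0, 2 * np.pi, 4)
    detune = rng.uniform(0.8, 1.2, 4)
    return sum(np.sin(2 * np.pi * frequency_hz * d * t + p)
               for d, p in zip(detune, phases)) / 4.0

def layered_shake(n_frames, fps=24.0):
    flak = 12.0 * smooth_noise(n_frames, 0.7, fps, seed=1)    # low freq, high amp
    engine = 1.5 * smooth_noise(n_frames, 11.0, fps, seed=2)  # high freq, low amp
    raw = flak + engine
    # "De-coupling": the mount chases the raw shake with lag and wobble.
    damped = np.zeros(n_frames)
    velocity = 0.0
    stiffness, damping = 0.25, 0.55
    for i in range(1, n_frames):
        velocity += stiffness * (raw[i] - damped[i - 1])
        velocity *= damping
        damped[i] = damped[i - 1] + velocity
    return damped

shake_x = layered_shake(240)  # ten seconds of horizontal shake, in pixels
```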
Interestingly, quite a few of the POV shots DNEG crafted for the show were fully synthetic. These were mainly shots where no pilot is seen in frame.
Planes in motion
DNEG built the majority of the plane models featured in Masters of the Air, sharing assets with other vendors where required. These builds started with scans of real planes carried out by Clear Angle Studios, as well as mock-ups made by BGI Supplies for the show. Bernasconi credits DNEG assets supervisor Jessica Lee with paying particularly close attention to detail on the planes, knowing that they would be heavily scrutinized by war historians.

“Thank God I had her. You need a certain type of person to deliver the level of detail that was required for this show. It was incredible the amount of work that her team did. She is someone extremely driven about attention to detail, very methodical, extremely capable from a modeling point of view. Jessica re-did every single plate on the B-17, and every single rivet. She created a way to distribute the rivets across the fuselage.”
Texturing plane detail and delivering livery such as call signs and numbers were extensive tasks for DNEG. One tricky aspect of the B-17 paint work texturing, observes Bernasconi, was the way it behaved in different lighting. “The metal paint of the non-chrome version was extremely challenging. It has a certain wide specular lobe, a certain way that the specular reacts. We had multiple layers of pigment on the actual metal trying to replicate that sheen to the highest degree.”
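The show’s layered paint shader isn’t public, but the “wide specular lobe” idea can be illustrated with a standard microfacet distribution: in the GGX term below, raising roughness spreads highlight energy across a wider cone of angles. All values are illustrative.

```python
# Sketch of why a "wide specular lobe" matters: the GGX normal-distribution
# term spreads highlight energy as roughness grows. This is a generic BRDF
# building block, not the show's actual layered-pigment paint shader.
import math

def ggx_ndf(cos_theta_h, roughness):
    """GGX/Trowbridge-Reitz distribution, with alpha = roughness squared."""
    a2 = roughness ** 4  # alpha^2
    denom = cos_theta_h ** 2 * (a2 - 1.0) + 1.0
    return a2 / (math.pi * denom * denom)

# Highlight intensity 10 degrees off the mirror direction, per roughness:
cos_h = math.cos(math.radians(10))
for r in (0.15, 0.45, 0.75):
    print(f"roughness {r}: NDF = {ggx_ndf(cos_h, r):.3f}")
```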
Damage was also crucial. Here, Bernasconi recognizes the efforts of DNEG CG supervisor Douglas Tancredi. “He created a modular destruction system so you could replace any parts of the wings and fuselage with exploded or rigged parts. Stephen Rosenbaum was very particular in that sense, he wanted explosions to create the look of the metal twisting on itself, not just to have a gaping hole, so you couldn’t just do it with a normal map. It had to be modeled and then procedurally placed on 100 planes.”
DNEG benefited from production providing an infographic booklet for each plane on each mission that detailed where the plane had been damaged. “We had to match that to our CG models,” says Bernasconi. “All the bullet hits on our planes are actually based on historical data and they’re placed where they happened.”
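One plausible way to encode such a workflow, purely as a hypothetical sketch, is a table of damage records keyed to plane and part that swaps intact geometry for damaged variants; all field names and part names below are invented.

```python
# Hypothetical sketch of driving per-plane damage from mission records,
# echoing the infographic-booklet workflow. Record fields and part names
# are stand-ins for whatever the production documents contained.
from dataclasses import dataclass

@dataclass
class DamageRecord:
    plane_callsign: str
    part: str          # e.g. "port_wing_panel_03"
    kind: str          # "bullet_hit" | "flak_tear" | "explosion"

def apply_damage(scene_parts, records, callsign):
    """Swap intact parts for their damaged variants for one plane."""
    for rec in records:
        if rec.plane_callsign == callsign and rec.part in scene_parts:
            scene_parts[rec.part] = f"{rec.part}__{rec.kind}"
    return scene_parts

parts = {"port_wing_panel_03": "port_wing_panel_03"}
records = [DamageRecord("example_callsign", "port_wing_panel_03", "flak_tear")]
print(apply_damage(parts, records, "example_callsign"))
```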
In terms of animation of the planes, Bernasconi worked with DNEG animation director David Andrews to deliver heavily choreographed–yet historically accurate–scenes. “The challenge here was that we wanted to stay true to the physics. We developed a system that restricted animators from being able to make tighter turns than what the plane would be allowed to do at that particular speed. Also, every single part of the plane was rigged: flaps, ailerons, elevators, stabilizers to perfectly portray the plane’s behavior during certain maneuvers.”
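The exact constraint system isn’t described further, but the underlying physics is standard: in a coordinated banked turn the turn rate is capped at omega = g·tan(bank)/v, so the faster the plane flies, the wider its turns must be. Here is a minimal sketch of clamping an animator’s requested turn rate, with illustrative limits rather than the B-17’s documented envelope.

```python
# Sketch of the kind of physics clamp described: cap an animated turn rate
# at what a coordinated banked turn allows at the current speed.
import math

G = 9.81  # m/s^2

def max_turn_rate(speed_ms, max_bank_deg=45.0):
    """Coordinated-turn limit: omega = g * tan(bank) / v, in rad/s."""
    return G * math.tan(math.radians(max_bank_deg)) / speed_ms

def clamp_turn_rate(requested_rate, speed_ms, max_bank_deg=45.0):
    limit = max_turn_rate(speed_ms, max_bank_deg)
    return max(-limit, min(limit, requested_rate))

speed = 80.0           # roughly B-17 cruise, m/s (illustrative)
animator_rate = 0.25   # rad/s the animator asked for
print(clamp_turn_rate(animator_rate, speed))  # clamped to ~0.12 rad/s
```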
What helped, too, was the virtual cinematography for the flying scenes. Rosenbaum dictated that there be no ‘floating’ cameras. “If there is a camera outside of the plane, it’s because it’s attached to another plane,” advises Bernasconi. “That put restrictions on how we could portray the story—the constraints were good and helped us be more realistic.”
Among the clouds
Below the planes, DNEG modeled every single kilometer the planes flew over. This was a major undertaking: acquiring satellite imagery and crafting geo-spatial textures for the European areas the planes traveled through. One challenge, however, was that the satellite imagery was modern-day, necessitating a significant paint process.

“I had an entire team painting out every highway, every modern building and replacing them with fields and older roads,” discusses Bernasconi. “There’s no gray for concrete or asphalt and only dirt roads. And then for every bombing target, we went and looked at historical photographs and repainted every single detail.”
Just as these ground environments were crucial for showing viewers where the planes were, so too were the clouds through which the planes fly. The amount of cloud cover was sometimes a story point in terms of visibility, while also giving the planes scale and a sense of speed. Bernasconi engaged DNEG environment supervisors Louis Melançon and Carsten Gomes to come up with a fully 3D cloud solution, one more physically accurate than he had seen in flying sequences in other shows and films. They also had to match what was known about the real clouds from the actual WWII bomber missions.
“For every mission,” recounts Bernasconi, “we knew the visibility pattern at any point. On a scale of 1 to 10 we were told, historically, how much the pilot could see, and based on that, we wanted to replicate that look with our cloudscapes. The first thing I did was create a cloud atlas. It was hundreds of images from the web, catalogued using a machine learning tool based on types like cumulus, stratus and so on. Stephen and I would go through the cloud atlas to identify the clouds we liked, and then I gave them to Jessica Lee and her assets team to create 3D low-poly representations of these clouds.”
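Bernasconi’s cataloguing tool isn’t public; as a hedged stand-in, zero-shot classification with an off-the-shelf CLIP model is one way such an atlas could be tagged by cloud type (requires the transformers and Pillow packages).

```python
# Sketch of cataloguing scraped cloud reference by type with an
# off-the-shelf model. This zero-shot CLIP approach is a plausible
# stand-in, not the tool actually used on the show.
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

CLOUD_TYPES = ["cumulus clouds", "stratus clouds", "cirrus clouds",
               "cumulonimbus clouds", "altocumulus clouds"]

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

def classify_cloud(image_path):
    """Return the cloud-type label CLIP scores highest for this image."""
    image = Image.open(image_path)
    inputs = processor(text=CLOUD_TYPES, images=image,
                       return_tensors="pt", padding=True)
    logits = model(**inputs).logits_per_image  # image-text similarity
    return CLOUD_TYPES[logits.argmax().item()]

# atlas = {path: classify_cloud(path) for path in scraped_image_paths}
```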
To then get to a desired photoreal cloud shape, Bernasconi reconsidered how clouds have traditionally been built in VFX. “When you look at clouds in films, what we don’t have is the non-linear deformation. Why? Because you tend to have a 3D noise applied to a volume. So instead we went the other way around. I wanted to see the vapor twisting and curling around. I wanted the level of detail to be higher frequency at the edges, but not so much in the center.”
DNEG artists then created the clouds in Houdini by running simulations of water condensing into cloud shapes. These were saved as caches, with two levels of voxels created for each cloud. “We were then able to blend between them to create softer looks around the edges,” says Bernasconi. “We did that for hundreds of clouds. Using the reference images, we would put them all together to create a macro cluster cloud, blend them into one voxel grid and then render them in 3D.”
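The two-level voxel blend can be sketched in miniature: a fine grid keeps high-frequency detail at the wispy edges while a resampled coarse copy keeps the core soft, which is one plausible reading of the blend described. The grids below are random stand-ins for the Houdini caches.

```python
# Sketch of blending two voxel resolutions of one cloud: fine detail at
# the low-density edges, the soft coarse body in the dense core. The
# density fields are random stand-ins for condensation-sim caches.
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

res = 64
fine = np.clip(gaussian_filter(np.random.rand(res, res, res), 2.0) - 0.4,
               0.0, None)                 # stand-in high-res density grid
coarse = zoom(zoom(fine, 0.25), 4.0)      # low-res copy resampled back up

# Weight by density: wispy edges (low density) keep fine detail,
# the dense core keeps the coarse soft body.
edge_weight = np.clip(1.0 - fine / (fine.max() + 1e-6), 0.0, 1.0)
blended = edge_weight * fine + (1.0 - edge_weight) * coarse
```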
Authenticity in the air
The synthetic clouds are just one example of where Rosenbaum and DNEG sought to bring the World War II bombing stories as accurately to the screen as possible. “Stephen would always ask me if what we had done was ‘correct’,” notes Bernasconi. “He was so driven to make sure everything was accurate. We would triple check every time. I mean, even something like the plane pieces in the air in episode 105 falling around the planes–they are dropping at the correct speed. You are just moving forward so quickly that they seem to stay where they are, almost like they are floating.”
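The floating-debris observation is simple relative motion, and a back-of-the-envelope simulation shows it: a fragment leaves the bomber with its forward speed, drag slowly bleeds that off while gravity pulls it down, and from a camera plane holding the same speed it barely drifts. The drag rate here is illustrative.

```python
# Back-of-the-envelope check of the "floating" debris effect: relative to
# a camera plane at the same speed, a fragment drifts back only slowly.
DT, DRAG, G = 1.0 / 24.0, 0.1, 9.81   # timestep (s), drag rate (1/s), gravity

plane_v = 80.0                         # shared forward speed, m/s (illustrative)
piece_vx, piece_vy = plane_v, 0.0
piece_x = piece_y = camera_x = 0.0

for _ in range(48):                    # two seconds at 24 fps
    piece_vx -= DRAG * piece_vx * DT   # drag bleeds off forward speed
    piece_vy -= G * DT                 # gravity accelerates the fall
    piece_x += piece_vx * DT
    piece_y += piece_vy * DT
    camera_x += plane_v * DT

print(f"drift relative to camera: {camera_x - piece_x:.1f} m "
      f"over {camera_x:.0f} m of flight; fallen {-piece_y:.1f} m")
```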

Another example Bernasconi offers is the smoke screen the pilots see concealing Trondheim on one bombing run. The actual direction of the wind at that time was determined and then fed into DNEG’s smoke simulations, so that it matched directly.
“We asked ourselves at the beginning, do we go for something that feels right, or that is right? We agreed that we would always focus on what is real and remain true to every historical detail. The story of Masters of the Air and the heroics of those involved needs no embellishment.”