You are going to flip when you see this video of how ‘The Mandalorian’ was made

The Mandalorian VFX

Video breaks down use of LED walls, Unreal Engine, ILM’s StageCraft and other tech

You may have already read a few things about the making of The Mandalorian, but now you can see it all in motion with this incredible video that showcases the virtual production approach behind the show.

The way many in-camera VFX shots were achieved, and interactive lighting obtained, on a stage in Los Angeles using real-time rendering is mind-blowing. It certainly harks back to the days of rear projection, but then clearly goes much further. All of this is outlined in the behind-the-scenes video.

The solution for The Mandalorian came about via a collaboration between ILM, with its StageCraft virtual production platform, Epic Games, maker of Unreal Engine, and Golem Creations, Jon Favreau's production company. Technology partners included Fuse, Lux Machina, Profile Studios, NVIDIA, and ARRI.

The series was recently recognized at the Visual Effects Society (VES) Awards, winning the Outstanding Visual Effects in a Photoreal Episode and Outstanding Model in a Photoreal or Animated Project prizes.


In a press release, ILM and Epic Games noted some of the big game-changers in crafting the series:

Over 50 percent of The Mandalorian Season 1 was filmed using this ground-breaking new methodology, eliminating the need for location shoots entirely. Instead, actors in The Mandalorian performed in an immersive and massive 20’ high by 270-degree semicircular LED video wall and ceiling with a 75’-diameter performance space, where the practical set pieces were combined with digital extensions on the screens. Digital 3D environments created by ILM played back interactively on the LED walls, edited in real-time during the shoot, which allowed for pixel-accurate tracking and perspective-correct 3D imagery rendered at high resolution via systems powered by NVIDIA GPUs.

The environments were lit and rendered from the perspective of the camera to provide parallax in real-time, as if the camera were really capturing the physical environment with accurate interactive light on the actors and practical sets, giving showrunner Jon Favreau, executive producer and director Dave Filoni, visual effects supervisor Richard Bluff, and cinematographers Greig Fraser and Barry “Baz” Idoine, and the episodic directors the ability to make concrete creative choices for visual effects-driven work during photography and achieve real-time in-camera composites on set.
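The core idea behind that perspective-correct, parallax-producing imagery is an off-axis (asymmetric) projection: the content on the wall is rendered from the tracked camera's position, with a view frustum whose corners are pinned to the physical LED panel. The sketch below is a minimal illustration of that general technique (the "generalized perspective projection" often used for CAVE- and LED-wall-style displays), not ILM's actual StageCraft code; the function name and NumPy implementation are my own.

```python
import numpy as np

def off_axis_frustum(pa, pb, pc, pe, near, far):
    """Off-axis perspective projection for a planar screen (illustrative sketch).

    pa, pb, pc: screen corners (lower-left, lower-right, upper-left), world space
    pe:         tracked eye/camera position, world space
    Returns the near-plane frustum bounds (l, r, b, t) and a 4x4
    OpenGL-style projection matrix.
    """
    pa, pb, pc, pe = (np.asarray(p, dtype=float) for p in (pa, pb, pc, pe))

    # Orthonormal basis of the screen plane.
    vr = pb - pa; vr /= np.linalg.norm(vr)           # screen "right"
    vu = pc - pa; vu /= np.linalg.norm(vu)           # screen "up"
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)  # normal, toward the eye

    # Vectors from the eye to the screen corners.
    va, vb, vc = pa - pe, pb - pe, pc - pe
    d = -np.dot(va, vn)  # perpendicular eye-to-screen distance

    # Frustum extents projected onto the near plane: these become
    # asymmetric as soon as the camera moves off the screen's axis,
    # which is exactly what produces correct parallax on the wall.
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    # Standard asymmetric-frustum projection matrix (glFrustum layout).
    P = np.array([
        [2 * near / (r - l), 0.0, (r + l) / (r - l), 0.0],
        [0.0, 2 * near / (t - b), (t + b) / (t - b), 0.0],
        [0.0, 0.0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0.0, 0.0, -1.0, 0.0],
    ])
    return (l, r, b, t), P
```

With the camera centered in front of a wall panel the frustum is symmetric; slide the camera sideways and the left/right bounds skew accordingly, so the rendered pixels line up with the real camera's view of the panel. A production system would re-derive this every frame from the camera tracking data.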

Congrats to everyone who contributed to The Mandalorian. It's an exciting time in virtual production and visual effects, and it's equally exciting to ponder how this tech can be used in other films, television and streaming series, and games in the future.
