And it involves getting a whole lot of VFX shots in-camera.
At last week's Unreal Engine SIGGRAPH User Group, surprise guest Jon Favreau talked about the use of LED walls during the making of The Mandalorian. During the same week, I got to see an example of how one of these LED walls – and the real-time rendering tech behind it – works.
Epic had set up the technology, and some example scenes, as a virtual stage at a nondescript warehouse in Los Angeles. Here, in partnership with Quixel, Magnopus, Lux Machina, Profile Studios, ARRI, and DP Matt Workman, Unreal Engine showed off what the LED wall set-up was capable of: in-camera VFX, VR scouting, and the ability to do lots of on-set tweaking.
The exciting part, for me, was seeing how a filmmaking workflow has changed (and will likely continue to change) – bringing a lot of the work much earlier in the process and possibly reducing the work in post. Of course, we've seen some of this before with rear-projection and LED walls, for example, in First Man and Solo: A Star Wars Story. Other upcoming productions are also reportedly using LED screens to aid in acquiring in-camera VFX.
In the demos we saw at the Unreal Engine virtual stage, an actor on a motorbike stood in the middle of a group of four LED walls (two at the sides, one curved at the back and one as the 'sky'). He was positioned on a stage floor dressed with dirt and sand. Behind him was a digital environment being rendered in real time, which meant it could be moved around, changed on the fly, re-lit, et cetera.
By shooting the actor in a certain way with a real camera, the seam between the digital and real worlds became invisible. In fact, to produce the right parallax shifts as the camera moved around, the rendered environment moved with it, in real time.
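The parallax trick comes down to rendering the wall with an off-axis (asymmetric) projection driven by the tracked camera position, so the image on the fixed panel always looks correct from wherever the camera is. Here is a minimal sketch of that math, following the well-known generalized perspective projection formulation; the wall corners, camera positions and near-plane value are illustrative, not taken from the Epic/Lux Machina stage itself:

```python
# Off-axis frustum for a fixed LED wall seen from a tracked camera.
# As the camera ("eye") moves, the frustum skews, which is what keeps
# parallax on the wall correct in-camera.

def sub(a, b): return [a[i] - b[i] for i in range(3)]
def dot(a, b): return sum(a[i] * b[i] for i in range(3))
def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0]]
def norm(a):
    m = dot(a, a) ** 0.5
    return [x / m for x in a]

def off_axis_frustum(pa, pb, pc, eye, near):
    """Frustum extents (l, r, b, t) at distance `near`, for a wall with
    corners pa (lower-left), pb (lower-right), pc (upper-left)."""
    vr, vu = norm(sub(pb, pa)), norm(sub(pc, pa))   # wall right/up axes
    vn = norm(cross(vr, vu))                        # wall normal, towards eye
    va, vb, vc = sub(pa, eye), sub(pb, eye), sub(pc, eye)
    d = -dot(va, vn)                                # eye-to-wall distance
    s = near / d
    return (dot(vr, va) * s, dot(vr, vb) * s,       # left, right
            dot(vu, va) * s, dot(vu, vc) * s)       # bottom, top

# A 2x2 m wall in the z=0 plane, camera 2 m in front of its centre:
centered = off_axis_frustum([-1, -1, 0], [1, -1, 0], [-1, 1, 0], [0, 0, 2], 0.1)
# Move the camera 0.5 m to the right: the frustum becomes asymmetric,
# so the image rendered on the (fixed) wall shifts accordingly.
shifted = off_axis_frustum([-1, -1, 0], [1, -1, 0], [-1, 1, 0], [0.5, 0, 2], 0.1)
```

With the camera centred, the frustum is symmetric; off-centre, it skews, and feeding those extents into the renderer's projection matrix every frame is what produces the moving-parallax effect described above.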
The LED panels provided light – and interactive light – onto the actor. He could 'ride' his bike in the environment and there would be no need to 'add' that environment later in post. Granted, this worked with a particular environment; others might exist on the stage only as reference or to provide interactive light, but it was a big deal to see.
A few other things were also demonstrated to us. One was the projection of a greenscreen on the LED panels, providing an instant way to key the actor off the background.
Another was the way Unreal Engine's nDisplay was used to merge the imagery across the different LED walls, which could also act as a sun, a key light source or, it seemed, anything you could imagine.
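For a sense of how nDisplay keeps the walls in sync, each rendering machine in the cluster is described in a configuration file that maps a viewport to a 'screen' placed at the physical position of its LED panel. The fragment below is an illustrative sketch in the style of the legacy UE4 .cfg format; all IDs, addresses and dimensions here are hypothetical, so check the current nDisplay documentation for the exact syntax:

```ini
; One node of a hypothetical two-wall cluster (values are made up)
[cluster_node] id=node_front addr=192.168.0.101 window=wnd_front master=true
[window]       id=wnd_front  viewports=vp_front fullscreen=true
[viewport]     id=vp_front   x=0 y=0 width=1920 height=1080 projection=proj_front
[projection]   id=proj_front type=simple screen=scr_front
; The 'screen' is the physical panel: 3 m wide, 1.7 m tall, 1 m off the floor
[screen]       id=scr_front  loc="X=1.5,Y=0,Z=1" rot="P=0,Y=0,R=0" size="X=3,Y=1.7"
```

Because every node knows where its panel sits in the shared space, the imagery lines up seamlessly across walls, and the whole surface can be repurposed on the fly as scenery, a greenscreen, or a light source.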
For more from the stage, check out Unreal Engine’s video below. Be sure to watch for future productions using this kind of approach from now on.
This week at befores & afters is #gettingshotsdone week. Find out how several productions are getting shots done with a range of different methodologies.

Buy issue #1 of befores & afters in print.