Air pod and office scenes this season relied on new virtual production techniques, an approach that goes back all the way to the beginning of the series.
When season one of Westworld was being made in 2015, co-creator Jonathan Nolan had, relates visual effects supervisor Jay Worth, recently observed the use of projection techniques in Interstellar for outer-space shots. It was something he thought could perhaps also be utilized on his upcoming Westworld.
“Jonah came to me,” recalls Worth, “and said, ‘What if we figure out a way to use a projection screen and a game engine and build an environment and then put a sensor in the room and a sensor on the camera and have it translate in real-time to the camera.’”
Foreshadowing the explosion of LED screens and virtual production approaches now seen in projects such as The Mandalorian, Worth and his team embarked on a project during that first Westworld season to see if Nolan’s idea could be realized.
“I found an artist and we built our own supercomputer – it was this crazy liquid-cooled machine. We built the interior core of the Mesa and we tried to figure out an environment that worked as a test case. We modeled that and brought it into a game engine, and we did some tests and it worked better than we thought. But the video cards just weren’t quite there yet in 2015 and we didn’t end up doing it.”
Ultimately, projection was used for the three-dimensional park map in the show, and LED walls with pre-rendered 3D backgrounds were relied upon for glass elevator scenes in lieu of bluescreens. It wasn’t until production on season three of Westworld that Nolan and Worth re-visited the more significant use of the technology.
Setting up an LED approach
Indeed, Nolan imagined that LED wall and Unreal Engine technology could be extremely beneficial for proposed air pod and other scenes in season three, and wrote those scenes with the tech in mind.
“Jonah was talking to me about all this in December of 2018 when we scouted [filming locations] Spain and Singapore,” says Worth. “We talked extensively about going back to the game engine idea and figuring out how to use it in season three.”
Worth enthusiastically embraced the idea, which would have the benefit of avoiding bluescreens, immersing the actors and crew more into the scenes, and be used to craft in-camera VFX shots in order to reduce the need for post-production.
Nolan and Worth were able to visit the set of The Mandalorian and take away some key lessons about how the filmmakers there were using game engine and LED techniques. Westworld did not necessarily have as many exotic environments to concoct virtually, but Worth saw the value of using screens for the flying scenes and a key office environment.
An LED shooting set-up made use of Fuse LED screens, with Profile handling the on-set logistics and Epic Games’ Unreal Engine serving as the game engine of choice. One of the key considerations was shooting the scenes on film. This meant a longer process of working out the correct film stock, lighting and color correction, but, says Worth, “once we figured out the recipe it all flowed together nicely.”
For the office environment – that of Charlotte Hale’s (Tessa Thompson) – an LED screen set-up was established that allowed for camera movement, i.e. the background imagery would adjust in real-time to suit the change in parallax. This imagery began with a live-action section shot at the City of Arts and Sciences in Valencia, which was then further enhanced with CG built by El Ranchito and fed into the game engine (Unreal Engine 4).
“We had it all optimized to render in real-time,” notes Worth. “As the camera moved, it would parallax with the camera. We could move the sun and all the water reflections were real-time, as well. Later in VFX, we didn’t touch a single background for Hale’s office when it was an Unreal Engine scene.”
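The parallax behavior Worth describes can be illustrated with a toy calculation (a simplified sketch only, not the show’s actual Unreal Engine setup): when a tracked camera translates parallel to the LED wall, a virtual element at the screen plane must stay put on the screen, while elements meant to sit farther away must slide with the camera so they appear more distant.

```python
# Toy parallax model for a camera-tracked LED wall (illustrative only;
# the production used Unreal Engine's own real-time rendering).

def parallax_shift(camera_move, screen_depth, element_depth):
    """On-screen shift needed for a virtual element to appear fixed in
    the world when the tracked camera translates by `camera_move`.

    `screen_depth` is the camera-to-wall distance; `element_depth` is
    the virtual depth of the element (same units). An element at the
    screen plane needs no shift; distant elements shift almost 1:1
    with the camera, which is what sells the depth cue.
    """
    return camera_move * (1.0 - screen_depth / element_depth)

# A 0.5 m lateral camera move, wall 4 m away:
for depth in (4.0, 10.0, 100.0):
    print(f"element at {depth:>5} m shifts "
          f"{parallax_shift(0.5, 4.0, depth):.3f} m")
```

This simple ratio follows from projecting each virtual element through the camera position onto the wall plane; a real-time engine does the same thing per-pixel, every frame, from the tracked camera pose.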
In the air
Meanwhile, the air pod scenes over a future Los Angeles also used the LED screen set-up, but did not rely on real-time rendered backgrounds. Instead, the LED screens displayed playback of VFX imagery during the shoot. “We didn’t need the real-time so much here,” observes Worth. “Things didn’t need to translate to camera since it was pretty much fixed inside the pod, so we realized we didn’t need to have that be created in a game engine.”
For those air pod backgrounds, live-action aerial photography of Los Angeles started with a camera array shoot. The resulting footage was augmented with CG buildings by Pixomondo. The angles were established for shooting, and that footage was then projected on the LED screens while the characters were filmed in an air pod mock-up on the LED screen stage.
The resulting in-camera footage did receive some alteration in post, largely because the future LA world was still being expanded upon as production continued, as well as for continuity reasons. “The majority of it was still captured in-camera,” says Worth. “The nighttime scenes, in particular, were all captured in-camera.”
Interestingly, for nighttime scenes of Dolores (Evan Rachel Wood) travelling in the air pod with Martin (Tommy Flanagan), Flanagan was not available for the shoot. “That meant,” reveals Worth, “we shot Tommy’s side on bluescreen and Evan’s side on the LED wall. We actually used interactive LED panels to project the light onto him that matched the interactive light that we had for the background and the foreground on Evan when we shot her.”
The future of filmmaking
Worth is adamant that LED screens, virtual production and real-time rendering are the future of film and television production, both for creating immersive shooting environments and for crafting in-camera VFX shots, and especially now, given the current filming limitations brought on by the coronavirus crisis.
“We’re looking at a whole new world,” details Worth. “We might have to say, ‘Hey, we’re just going to shoot a wide shot here and not shoot around people and not do coverage, but instead build and create those environments with LED screens.’ I think we’re going to have a lot more flexibility and freedom to do that in the future. I think it’s going to be another tool in the toolkit, and it’s going to be even more valuable than we thought it would be.”