A ‘Midnight Sky’ VFX conversation—Part 1: LED walls, spaceships and CG face replacements

December 23, 2020
Gray-shaded digital astronauts by Framestore. Courtesy Netflix.

How the film used virtual production and multiple VFX methodologies.

George Clooney’s The Midnight Sky, now streaming on Netflix, follows action both on Earth (in the Arctic) and in space (mainly on the spacecraft Æther). The filmmakers looked to new virtual production methods, in particular LED walls using ILM’s StageCraft technology set up at Shepperton Studios, to help realize icy Arctic environments.

Meanwhile, the Æther was a completely CG creation by Framestore, and also required extensive spacewalks to be visualized. Then, another aspect of the production involved photoreal CG face replacements, partly arising from actor Felicity Jones’ pregnancy and partly to help complete several complex shots.

In part 1 of befores & afters’ coverage of the film, we look at the LED walls, the space work and those face replacements with Midnight Sky’s VFX supervisors, Netflix visual effects supervisor Matt Kasmir and Framestore visual effects supervisor Chris Lawrence.

George Clooney with first AD Lee Grumett, grip John Flemming and cinematographer Martin Ruhe on the StageCraft/LED wall set for ‘The Midnight Sky’. Courtesy Netflix.

Shooting with LED walls

b&a: It feels like you were able to use a whole range of techniques in VFX that empowered the filmmakers here, especially virtual production techniques—what were they?

Matt Kasmir: In general, we were using various technologies instead of bluescreens. We were using a Rosco gel, which was a neutral 50% grey, but behind it were SkyPanels. This meant you could have a bluescreen if you wanted: put color into the SkyPanels and the gel would diffuse it beautifully, so you'd have the perfect bluescreen. But it also meant we could use the lighting desk controller's iPad to adjust backgrounds perfectly for whatever we were envisaging going behind, and that worked for space and for the Arctic.
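Neither supervisor specifies the control protocol, but fixtures like SkyPanels are typically driven over DMX or its network transport, sACN (E1.31). Here's a minimal, hypothetical sketch of flooding a bank of panels with one colour from a script, using the open-source Python `sacn` library; the panel count and the assumption of a simple 3-channel RGB profile on universe 1 are illustrative, not from the production:

```python
import sacn

NUM_PANELS = 12          # hypothetical panel count for one wall
CHANNELS_PER_PANEL = 3   # assumes a simple 3-channel RGB DMX profile

sender = sacn.sACNsender()
sender.start()
sender.activate_output(1)   # DMX universe 1 (assumed patching)
sender[1].multicast = True

def set_wall_color(r, g, b):
    """Flood every panel with one colour, e.g. a bluescreen blue."""
    frame = [r, g, b] * NUM_PANELS
    # Pad the rest of the 512-channel universe with zeros.
    sender[1].dmx_data = tuple(frame + [0] * (512 - len(frame)))

set_wall_color(0, 0, 255)      # a 'perfect bluescreen' behind the gel
set_wall_color(128, 128, 128)  # or a neutral grey wash

sender.stop()
```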

Chris Lawrence: One of the issues for the production was that there was a young child who couldn't do the extensive location work in hundred-mile-an-hour winds. So a lot of that Arctic journey had to come back to the studio, and it needed to cut seamlessly with the location shots. It was quite an interesting application of the technology: basically like an HDR on a dome, except that the dome was SkyPanels and soft diffusion through this Rosco gel. In post, when we were looking at the shots, the only way you could tell which one was which was George's beard, which had slightly waxy snow instead of icy snow.

StageCraft/LED wall set-ups. Courtesy Netflix.

Matt Kasmir: Even practical snow made of ice and water looks dead on stage. The reason is that real snow subtly reflects its environment. Even though you can't physically see that environment, take it away and the snow just looks dead. Whereas the Rosco flooded it with light without burning it out.

We could also do crazy tricks because the SkyPanels acted as a very low-res screen. I was creating 420 by 420 QuickTimes of the aurora borealis, and the lighting team were piping those through the SkyPanels and using them to light. There was a whole scene with the aurora that was eventually cut, but again, it worked so successfully. It just gave us movement. We could plot the direction of the sun, so that at times when he was lost in the snow, a very faint outline of the sun coming through was his only source of orientation. We could control that, so he didn't have to move as much. It was something I would love to use again and again, along with our LED wall.
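Kasmir doesn't detail the playback pipeline, but treating a SkyPanel array as a "very low-res screen" amounts to resampling each video frame down to one colour value per fixture. A hedged sketch in Python with OpenCV; the grid size, filename and `send_to_fixture` stub are all illustrative assumptions:

```python
import cv2

GRID_W, GRID_H = 20, 20   # hypothetical fixture layout, not from the film

def send_to_fixture(x, y, r, g, b):
    # Stand-in for the real output, e.g. the sACN sender sketched above.
    pass

cap = cv2.VideoCapture("aurora_420x420.mov")  # hypothetical filename
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Area-average each frame so every fixture gets one RGB value.
    grid = cv2.resize(frame, (GRID_W, GRID_H), interpolation=cv2.INTER_AREA)
    for gy in range(GRID_H):
        for gx in range(GRID_W):
            b, g, r = grid[gy, gx]   # OpenCV stores channels as BGR
            send_to_fixture(gx, gy, int(r), int(g), int(b))
cap.release()
```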

This was ILM’s StageCraft, and the irony was we were using StageCraft before COVID. Now obviously, it’s the technology du jour. But we wanted to do it because James Bissell, our production designer, was building this very reflective set and the thought of having just a bluescreen outside these huge windows was just a bit depressing, frankly.

Chris Lawrence: In particular, you can't frame for a reflection that you can't see. I guess you could use a real-time tracked camera to do that, but it would be a poor imitation of getting real reflection. So I think Jim's vision of it was poetic: literally, the reflections were part of the design. He wanted an old man in a reflective building. It was an image that he really wanted to lean into.

Interiors of the Arctic Barbeau Observatory. Courtesy Netflix.

Matt Kasmir: And going back to the whole post thing, I've been burned before with the kind of après-ski look. Everybody wants to expose for the interior, but if you're exposing for the interior, you're going to blow out the exterior. So you tend to end up with this very compy-looking background.

It was daunting in-camera. But because we had a tight post schedule, the thought of getting all of our Arctic Barbeau environments in-camera was also very appealing. It was always possible to change things in compositing, but the overall effect was going to be locked down to something we didn't have to think about. And then there was some lead-up time: I had to go to Iceland to shoot plates. We shot on a five-camera array, all ALEXA Minis. And, as luck would have it, the only snow we saw in Iceland was on the one day I shot the plates.
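The interview doesn't say how the array plates were assembled downstream, but footage from a multi-camera array is commonly stitched into a single wide panorama for playback. A minimal sketch using OpenCV's high-level stitcher, with hypothetical filenames standing in for stills pulled from the five ALEXA Minis:

```python
import cv2

# Hypothetical stills, one pulled from each camera in the five-way array.
paths = [f"array_cam_{i}.png" for i in range(1, 6)]
images = [cv2.imread(p) for p in paths]
assert all(img is not None for img in images), "missing plate"

# The high-level stitcher estimates overlap, then warps and blends.
stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
status, pano = stitcher.stitch(images)
if status != cv2.Stitcher_OK:
    raise RuntimeError(f"Stitching failed with status {status}")
cv2.imwrite("arctic_pano.png", pano)
```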

Chris Lawrence: I don’t think that ILM had done an exterior window view with StageCraft before this show. We had the ability to change time of day, weather conditions, skies on the fly. So it made it incredibly useful.

Matt Kasmir: There’s also a shot where Augustine is putting the goggles and mask onto Iris before they head outside, and you can see the exterior reflected in her goggles. You don’t actually physically see outside; you just get it reflected in the sets and the goggles. That would have been an impossible shot, or at least a particularly dull one, had it not been for StageCraft. It’s those little nuances that I’m particularly proud of.

A further StageCraft/LED wall set-up. Courtesy Netflix.

Making the Æther

b&a: Where did you begin in terms of the outer space scenes?

Chris Lawrence: We started by previs’ing the whole spacewalk scene and also the sinking pods (the spacewalk was done with Nviz and the sinking pods were done with The Third Floor). Both used a V-cam, and they were lensed by the DP Martin Ruhe and by George as well…

Matt Kasmir: …to the point where Nviz even incorporated a lot of Martin’s detuned lenses. He’s always liked unusual lenses. There’s a famous one from the film The Assassination of Jesse James by the Coward Robert Ford, which is now actually called the ‘Jesse James lens’, where the second lens is reversed, so you get an inverse de-focus around the edges as you focus the centre. We had all of these in our armoury in our virtual camera, and it meant that when we went in to shoot, we shot almost shot-for-shot what was previs’d via the virtual camera. And unlike normal previs, there was ownership, because George and Martin had physically filmed this. It wasn’t someone at a computer typing in numbers to create a track.

The Æther. Courtesy Netflix.

Chris Lawrence: The design of the Æther ship was quite a big deal. It was very much led by Jim Bissell, but we also installed a visual effects art director by the name of Jonathan Opgenhaffen, and then there was a concept designer, Stevo Bedford. They designed this ship directly underneath Jim, even though it was only ever going to be built as a digital asset for the exterior, with little proxy set pieces. It was also scouted in VR and kind of lensed with the iPad. The configuration of modules, from the airlock through to the shield where the action takes place during the spacewalk, was very much considered as set design.

Matt Kasmir: Each part of the ship had its own backstory. The central trunk had been bastardised from what was already up in space. And the swinging arm was 3D printed, because apparently half the cost of getting anything into space is launching it, so NASA are genuinely looking at firing a 3D printer into space. In fact, it’s based on the Mars habitation project.

CG face replacements

b&a: I know that you had some CG replacements on the show—how did that come about?

Chris Lawrence: One of the other interesting things that happened on this show is that during pre-production we found out that Felicity was pregnant, and they wrote her pregnancy into the story. She was an early victim of the travel ban, because she couldn’t actually fly to the location where we were going to shoot planet K23. So we started doing R&D into whether we could use facial capture and do a CG face replacement for her. That then led us down a path of thinking about how difficult it is to do zero-G space shoots, with all of the motion control and all of the difficult stunts.

Gray-shaded digital astronauts by Framestore. Courtesy Netflix.
Final rendered shot. Courtesy Netflix.

Looking at that as a whole, in the context of our production plan and shooting schedule, we thought: if we can do a CG face, then, looking at the previs we were making, we could see that you could actually shoot this stuff and do it all using facial capture. So we were forced into that route of thinking because Felicity couldn’t fly, but then it became a really critical tool for the rest of the sequence. Basically, anything wider than a Cowboy Shot was normally a CG face, and any character in the background of a shot was normally a CG face. For the capture, we used Clear Angle’s Dorothy system to get a really good high-res scan.

Then we used Disney’s Anyma system to do the facial performance capture. That gave us everything except for the eyes, which obviously are the trickiest bit; you have to hand-track them very meticulously. And we’ve got some quite amazing side-by-sides of the various actors. My favorite one is of David, who plays Adewole: it’s the Anyma footage next to our CG render, and I’m not saying they’re indistinguishable, but you definitely do a double take as to which one’s which.

For the usage that we had, with faces up to 500 pixels high, which is quite a close CG face for us, it was remarkably effective. It was funny when, at some point during the editorial process, we were asked by the editor if he could have an audit of all the times we’d used the CG faces, because he’d lost them. He couldn’t find where they were in the cut, and he just wanted to make sure that they were to his specification in terms of his selected performances. I think it speaks to the invisibility of that work that that was what ended up happening.
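That 500-pixel figure can be sanity-checked with simple pinhole-camera arithmetic: a face’s height on the sensor is roughly focal length times real height divided by subject distance, then converted to pixels via the sensor height. Every number in the sketch below is an illustrative assumption, not a figure from the production:

```python
def face_height_px(focal_mm, distance_mm, face_mm=240.0,
                   sensor_h_mm=18.1, image_h_px=2160):
    """Approximate projected face height in pixels (pinhole model).

    Defaults are illustrative only: a ~24cm head, a Super-35-ish
    sensor height and a 4K-ish vertical resolution.
    """
    image_mm = focal_mm * face_mm / distance_mm   # height on the sensor
    return image_mm * image_h_px / sensor_h_mm    # sensor mm -> pixels

# e.g. a 50mm lens roughly 2.4m from the actor:
print(round(face_height_px(50, 2400)))   # ~597 px, around that 500px range
```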

Stand-in actor for Felicity Jones for a scene on planet K23. Courtesy Netflix.
Final shot: face replacement by Framestore, environment by One of Us.

Matt Kasmir: We replaced around 50 or 60, didn’t we?

Chris Lawrence: Yes, around 50 or 60. What was great about it was it really opened up the shooting of that spacewalk scene, which was set to become quite a technical exercise with motion control.

Matt Kasmir: For outside the airlock, we originally had seven days scheduled to shoot. And we did it in three because we were just crossing off shots. Then it was two days in the airlock interior.

Chris Lawrence: We had a really nice collaboration with Jenny Eagan, the costume designer, who is an old friend of Matt’s and mine, and whose work I’m a huge fan of. She really interfaced with us on the space suits, which would ultimately be digital a lot of the time, right down to their construction, and really involved us with FBFX, who had done all the 3D printing and the physical construction of the costume. They were sharing CAD files and super close-up material samples that gave us additional detail.

We obviously did all this back on Gravity at Framestore as well, but it’s so much better now. I remember our CFX lead, Mike Thompson, sent me an email at some point just saying, ‘This suit is so much better than what we did on Gravity.’ It had to work in 4K, close-up. And the brief was that it should be indistinguishable, that you should not be able to tell the difference.

And that proved useful on the interior shots, which we had planned to do mainly in-camera with wires, although I think we always planned a bit of CG double replacement for the wider shots. That was, again, quite difficult to shoot, and it was performance-driven in terms of it being a very emotional moment. The staging was complicated, too, because you had to get three actors very close to each other and you’re playing with there being no ‘up’. So the camera’s rolling, you’ve got set pieces flying out, things like that; the whole thing was just quite complicated. There’s an interior shot with the three actors that in the end was completely CG with these facial capture faces. We did shoot the scene…

Matt Kasmir: …but it was a bit clunky.

Chris Lawrence: Yeah, you could feel the wire movement. And to have kept those performances, I think we would have had to cut around it or make other creative choices. Whereas we were able to say, ‘Actually, our digital assets here are so good, let’s just use them.’

Plate photography. Courtesy Netflix.
Plate flipped. Courtesy Netflix.
Final shot. Courtesy Netflix.

Matt Kasmir: Also, there was no motion capture, so a lot of these fully CG shots are all hand-animated by Framestore. You might use rotomation as a starting place, but then a lot of it was totally created, because the actors themselves were at a loss as to what they were doing. They’d never seen the outside of the ship.

Chris Lawrence: With the interior airlocks in general, there’s a rolling camera all the way through. That was all talked about in prep, but the complexity of it meant you’d have had to invent a circular film format to film it and still have control in post. Ultimately it was done by extending the sets and the space suits in post, quite often. Steven Mirrione, the editor, did a pass once he had an assembly or a fairly locked cut, because it was really critical that the roll from one shot to another matched. That’s why it was shot that way: you never would have known where you were cutting. Then it was just invisible effects work to extend the shots.

More to come at befores & afters on Midnight Sky: more zero-G work, an ice fall and holograms.

