Zero-gravity frame rates, deep fakes and moon scrapes: the VFX of ‘For All Mankind’

An in-depth conversation with VFX supervisor Jay Redd.

The second season of Apple TV+’s For All Mankind—created by Ronald D. Moore, Ben Nedivi and Matt Wolpert—has just wrapped up. The series presented a dazzling array of challenges for its VFX crew, led by production visual effects supervisor Jay Redd.

In this befores & afters deep dive, Redd discusses just some of those challenges, including crafting an expanded moonbase and several spacecraft, the effects of a solar flare on the lunar surface, and making historical figures say things they didn’t actually say.

Other aspects of the season 2 production Redd highlights involved coming up with a convincing way to sell zero-gravity (hint: a certain fps was used), dealing with an array of stock footage at different fidelities, creating a shocking fire inside a space helmet, and delivering a dizzying final pull-back shot to hint at what’s next for the show. Plus: visors, a common but always tricky challenge in any space project.

Get the details here with Redd, but a warning, this interview contains spoilers.

b&a: How did you approach shooting scenes on the moon?

Jay Redd: At Sony Studios in Culver City, sometimes we’d have a smaller plot of the lunar surface, with the regolith and boulders and gray gravel and rocks. And then we would shoot in Long Beach at an old Boeing aeroplane manufacturing and repair warehouse that is big enough to park a 747 inside, and we would create a much bigger plot of moon so that we could actually drive rovers or have astronauts run. We were limited a little bit on season 1 by size and we decided to go bigger for season 2, because now there are 10 astronauts on the moon at the Jamestown base.

Dan Bishop, who’s our production designer, and I worked together on plotting out where boulders would be, where berms would be, the kinds of measurements that we would need for elevation changes, could we shoot this direction on one sequence, shoot the other direction on the other sequence? But never moving the sun.

b&a: Actually, tell me about the sun, or what you shot for the sun.

Jay Redd: We shot with a 100K light. It’s one source. We’re not using other sources, because the moon is the moon, and it’s one sun, and you get all this amazing bounce from the regolith. So that just never moved, ever, because it was sitting on forklifts.

Also, there’s no atmosphere on the moon, it’s a black sky all the time, so we surrounded our entire set with black instead of green or blue. Some people might go, ‘Well, wait, how are you extracting mattes and keys?’ Oftentimes, we’d be able to pull mattes just from exposure and from luminance. And then the unsung heroes of modern visual effects are rotoscope artists. They do such an incredible job on extracting subjects in complex backgrounds.

“I want it to feel like it’s raindrops on the moon.”

One interesting thing was that in the Boeing location, we fought our own atmosphere in the air. Sometimes you’d open the doors at 7:00 in the morning and there would be mist. Southern California was also having these massive wildfires pretty nearby, plus we’re next to the Long Beach airport, so there’s exhaust, and a little bit of fog. Now, imagine shooting against a really big light— because we love backlighting, who doesn’t love backlighting, it looks cool, coming from behind you. But the mist would create this atmosphere in the scene. It’s like, ‘Oh, no, no, the moon is supposed to be crystal clear. We’re not supposed to have any of this!’ This meant there was some pretty complex work done to get rid of fog and get rid of atmosphere in shots to clean up the backgrounds.

b&a: In the show, I thought that not a lot was necessarily made of the weightlessness on the moon while in the base, but I’m guessing there was wire work for some of the scenes on the surface?

Jay Redd: You’re correct. We did take some creative license. We know that the moon is 1/6th of the Earth’s gravity, and so there’s definitely wire work for the exterior shots. But for the interiors in Jamestown, we kind of just let everybody walk around. Certainly, some people might say, ‘Oh, it doesn’t look like 1/6G inside Jamestown.’ You could argue that that’s true, but also that the astronaut suits are pressurised when you’re outside, and it changes the range of motion, which is why you always see people with their arms out kind of like the Michelin Man. Part of that is because the suits are pressurised and they have limited rotation.

In terms of the wire work, Todd Schneider was our stunt coordinator, and there were lots of cables, lots of wires, but the secret sauce of shooting is our frame rate. We shot at 32 fps. We found that 32 frames per second was the ideal look for exterior moon shots.
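For readers curious about the numbers behind overcranking: footage captured at a higher rate and then played back at the standard 24 fps is stretched by the ratio of the two rates. A minimal sketch (the helper name is mine, and standard 24 fps playback is assumed):

```python
def slowdown_factor(capture_fps: float, playback_fps: float = 24.0) -> float:
    """How much on-screen motion is stretched when footage shot at
    capture_fps is conformed to playback_fps."""
    return capture_fps / playback_fps

# 32 fps exteriors played at 24 fps: every second of action lasts a
# third longer on screen, lending the astronauts extra apparent mass.
print(round(slowdown_factor(32.0), 3))  # 1.333
```

By the same arithmetic, the 26 fps Redd mentions from Babe gives a much subtler 26/24 ≈ 1.08× stretch, which is why it reads as weight rather than slow motion.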

b&a: How did you come to that? Just from testing, or had you done something previously that you knew gave that feeling?

Jay Redd: Well, it’s funny. It came from testing, but we all had brought different kinds of experience. I’d done a little bit of underwater stuff before to make scenes look more dreamy, for example. And then even back on Babe, when I was a TD at Rhythm & Hues, I had learned that a lot of the animals were shot at 26 frames per second, which gave them a little bit more weight. Because, instead of feeling so manic and crazy—cats and dogs and pigs can move really fast—it just slowed them down by a couple of frames, and you just get a little better feeling. You’re also cutting against human beings, who are more massive and move slower.

“One interesting thing was that in the Boeing location, we fought our own atmosphere.”

So, the 32 frames a second was the big secret sauce for exteriors. In addition, when you’re having somebody run, you can take giant leaps, say 10 or 15 feet, and that’s where the cables would help take some of that mass away and some of the gravity away.

b&a: One of the hot topics right now is virtual production and LED walls, but it doesn’t always work for every production. Did you look to that here?

Jay Redd: We did look at it quite a bit. On season 1, we experimented a little with it, outside the LSAM, which is the lunar lander. We had some LED screens outside for the moon. It wasn’t done in Unreal Engine and it wasn’t real-time rendered playback, but we had planned some moves so that when you were in there, you saw the level of the moon change in the windows. But honestly, we ran into problems with shifting colors against the panels and all that kind of stuff. We ended up replacing a bunch of the exteriors with traditional compositing methodologies.

Then we talked a lot about it for season 2, but what’s tricky about a show like this is that we’re on a different schedule than, say, The Mandalorian. Those shows take all the post production work of designing, and decision making, and environment making, and move it to the front. You front-load everything suddenly, and most shows don’t get going like that. We would need months, and months, and months to get things ready just to start shooting, and then still have post production in the end.

I actually did some early stuff, even back on Men in Black 3, where we had LED panels outside just to use as reflections of Times Square. While we didn’t have it right outside the windows as a plate, we did use it for interactive light, and on windshields.

On For All Mankind, you might think, ‘This would be so good!’ even just for the lunar backgrounds. But here’s the thing to think about: visors. When you see a visor on an astronaut—and I’m sure you’ve talked to other people doing other sci-fi shows—those visors ultimately have to be 3D. Ninety per cent of the time, we’re putting in a CG visor.

“Most people have no idea how much work the visors are.”

Plus, we do a lot of storytelling in the reflections of visors. If we wanted to use real visors and capture the proper reflections in real visors, well, you’d have to put in a 360 degree LED wall, and you wouldn’t have crew members be able to stand in front of the visors. Immediately, I do the math in my head, and I say, ‘Well, you can’t do that. We’re doing CG visors. We’ve got to put a 3D environment in the background anyway.’ So, we’re just not there yet for full virtual production on this show. I think we’re certainly investigating and I’m thinking about it all the time.

b&a: Let’s talk about some specific shots. In episode 1 of season 2, there’s the solar flare impact on the surface of the moon. How did you approach that?

Jay Redd: That was an intense R&D effort working with Method Studios. For the initial idea, Ron Moore had said to me, ‘I want it to feel like it’s raindrops on the moon,’ and he kind of just left me with that. ‘Help me come up with something.’ So, I looked at it. Like, how are you going to move dust on the moon? There’s no air, right? There’s no atmosphere, there’s nothing to really influence anything movement-wise.

However, NASA has done some really interesting research on electrostatic and electromagnetic radiation, and there’s been some fantastic research into moving tiny particles off the surface. There are also accounts from Apollo astronauts who, when leaving or approaching the moon with the sun behind it, have said they see an atmosphere.

Garrett Reisman, a NASA astronaut who’s one of our consultants, and I started chatting about this. In the show, we’re saying it is a giant solar flare. They haven’t measured one this big before. We immediately go, ‘Okay, let’s take some creative license here. We’ve got to have it visual, have it be big.’

I started doing a bunch of research into electrostatic and electromagnetic levitation of dust particles, seeing the sketches from the Apollo astronauts of these amazing rays. Also, I have a musical background, and I’ve always been interested in cymatics, or Chladni patterns. If you’ve seen the metal plates with grains of sand on them, where you play different tones and sine waves at different frequencies and you see these amazing designs come to life, that started to become a big influence on us. I looked at sand dunes, wind patterns. I looked at what water did on river beds, in terms of what kind of patterns would happen over time.
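Those Chladni patterns come from standing waves on a vibrating plate: sand migrates to the nodal lines, where the vibration amplitude is zero. A minimal sketch of the classic square-plate mode shape (an idealized textbook model, not anything from the show’s actual pipeline):

```python
import math

def chladni_amplitude(x: float, y: float, n: int, m: int) -> float:
    """Standing-wave amplitude at point (x, y) on a unit square plate,
    for the superposed mode pair (n, m). Sand collects along the nodal
    lines, i.e. wherever this amplitude is (near) zero."""
    return (math.sin(n * math.pi * x) * math.sin(m * math.pi * y)
            + math.sin(m * math.pi * x) * math.sin(n * math.pi * y))

# The plate edges are always nodes (sand settles there)...
edge = chladni_amplitude(0.0, 0.3, 3, 5)   # 0.0
# ...while the centre of this odd/odd mode pair is an antinode.
center = chladni_amplitude(0.5, 0.5, 3, 5)  # -2.0
```

Evaluating this over a grid and marking where the amplitude crosses zero reproduces the intricate line drawings Redd describes, which is why different tones (different n, m pairs) yield such different designs.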

“You don’t steer a ship like you do a car on the Earth.”

I started putting these concept sheets together for Ron and saying, ‘What if it’s ripples or spikes? The spikes start to feel like those are literally energy, and the ripples of radiation are starting to come across the regolith, that feel a little bit like water, but we don’t want it to feel like water.’

It took many, many weeks; actually, it took a few months to arrive where we did. I put together a pitch video with cymatics and sound waves, and radiation, and X-rays, and river patterns, and dunes, kind of like a mood board. I worked with Method in Vancouver—Craig Wentworth was the visual effects supervisor there for this.

We started combining stuff and trying to figure out how to get magnetic rays moving in different ways and yet also have columns of dust that come up periodically. That’s the gist of how it came together: trying to make it work across the entire sequence, and making something tangible to affect the astronauts. Otherwise, it would just feel dead because it was this ‘unseen’ danger.

b&a: There’s some great shots of spacecraft on the moon, the lander, as well as wide aerials of a rover. What I love about these scenes, and it’s kind of related to the lighting you mentioned up front, is that it really portrays that harsh direct sunlight that you would obviously get on the moon. How tricky was that to do in CG?

Jay Redd: I think that’s a great question. I think the key behind it is the fact that we’re using photorealistic rendering now, like photon rendering, and I think that is really key. This would have been harder to pull off, let’s say 10, 15 years ago even. There’s something elegant about one light source, and then everything becomes reflective. We also have so much nice live action reference that we can look at that all the time. It’s kind of a creative miracle that the lunar surface was almost like an 18% gray card. It’s quite gray, and it’s not white. Our suits are white, and the surface is gray, but what’s nice is you always get reflected light back.

In deep space, if you just have a direct sun, there’s no fill light on the ship—it’s like it’s just black. It looks terrible. It doesn’t look right. So, we’re always saying, ‘Oh, a little star bounce here…’ But on the moon, you get this bounce automatically. So, it was rare on moon surfaces that I would have to ask for more fill light, say. If you follow the real materials and write your shaders correctly, you can get a lot of this from your photoreal rendering.

b&a: Then there’s the confrontation at the drill site that results in the shooting in episode 8. What were some of the challenges of filming that scene?

Jay Redd: For that one we needed an expansive landscape. On set we could only build something so large. In a way, it’s kind of the parallel to walking through a forest on Earth. Since there are no trees on the moon, we have giant boulders, fields of boulders, like a rock forest. We worked with Zoic on this—the VFX supervisor was Jeff Baksinski. We expanded the moon surface with more rocks. We had to create obstacles so that it wasn’t obvious that you could see around things, they’d have to peek around rocks. Or we’d use the visors to help show reflections of rocks.

We took some license with sun direction a couple times, too. We didn’t move it on set, but we would sometimes tweak where the sun was in the visor. If you pay really close attention, you’ll notice that sometimes we stick it in there, even though it was over here, say at 30 degrees. Again, you need to tell the story. And the reflections are so weird anyway—it’s like going into a funhouse in a circus, a warping mirror, you don’t know exactly what’s happening. So, we use that to our advantage sometimes.

The challenge was, it was actually a giant replication of dozens and dozens of extra rocks, and then it’s reflections inside a reflection inside a reflection. When you’ve got two astronauts together, and a camera, and a crane, and some shadows on the ground, it starts to become a really crazy puzzle. It’s like, ‘Okay, I’m painting that out, but I have to replace the astronaut’s visor in this one.’ And we go two or three levels deep in reflections back and forth of paintwork and rendering.

b&a: Did you ever get to the point where you thought, because of that challenge, you should do all CG suits and visors, or was that an impossibility?

Jay Redd: It’s not an impossibility, it’s just that, again, even though I’m a VFX person, I’m also an in-camera person. Imagine if you had to suddenly animate four other astronauts just in the reflection. I don’t want to animate astronauts. I don’t want motion capture astronauts. I don’t have time for that when we can get it in-camera and we have incredible paint people. But what it does require is that we are doing full 3D tracking to the helmets.

Every time you see a close-up astronaut, we have done a tight 3D track of that same helmet that has been scanned in order to deal with the visor and reflection. Again, the unsung heroes are paint people.

Most people have no idea how much work the visors are. When I came onto the show, I warned everybody. And now it’s just part of our language. I had a production meeting yesterday, for instance, and everyone’s like, ‘Okay, Jay, what are we doing with the visors?’ And I said, ‘Okay, gold visor down. When they lift it, it’s empty.’ The thing is, we’re going to replace the gold one anyway, but it’s nice to have it because you could see it, and we don’t see the actor through it, so it doesn’t matter. But once the gold visor goes up, I know I’ve got to tell a story with the clear visor, so you just shoot without it. If you leave it in, and there’s lights, and there’s camera crew, and there’s all this crap we’ve got to paint through, we don’t want to affect performances, so it’s easier to just go, ‘Let VFX make the clear one.’

b&a: In that scene, after the shooting, when you see inside the cosmonaut’s helmet, it is on fire. That’s a really shocking moment. How did you accomplish that?

Jay Redd: We talked a lot about, how would an astronaut die? It could just be a puncture wound, just from bleeding, but the idea came up from one of our writers to see a depressurization. The idea behind it is that the bullet goes through the Kevlar, pierces the fabric and hits a piece of metal and maybe one of the ventilators, and that sparks. Imagine you have oxygen in your suit—a spark in oxygen contained in a suit would just be an inferno internally.

On set, I worked with Ross Berryman, our DP, and the spacesuit costume people at Global Effects. We didn’t know exactly what the fire was going to look like, but I knew that I needed to have some interactive light. We lined the interior of the helmet with multicoloured LEDs, and then we programmed some flame lights, just a flicker of orange and yellow and white to get the movement of fire with the visor down, so you could actually feel it through the visor.

The first shot is pure fire, the second shot is where it starts to burn the inside a little bit, soot and smoke starts to build up to the point where the oxygen is spent and it dies down.

“That shot will never go away, for some reason.”

Zoic did these shots. I found a bunch of footage of kilns for ceramics and wood burning fire stoves, which had a gentle feel. I was like, ‘Oh, this doesn’t feel violent enough.’ I would just take things to my editing system and speed it up four or five times. It gave it this upward motion, like you would imagine a wood burning stove at its hottest point. That was a direction to Zoic—‘Here’s the kind of speed we want’—and then they did all this incredible tracking and matchmoving. There’s fire inside but then there’s all these reflections of the moon in the visors. It was very complicated work.

b&a: For some of the spacecraft launches or flight scenes, were you using any stock footage or doing any restoration or enhancement on stock?

Jay Redd: Yeah, that’s a great question. This show is just packed with stock, enhanced stock, full CG launches, mixtures of all that. We’d literally have to go scene by scene to break it down.

The weird thing about stock is that this season takes place in the ’80s, which is a weird time because that’s when cheap video came out, but film was still being used. So, a lot of the shuttle launches were shot on pretty crappy video, and HD wasn’t a thing. You either had film, or three-quarter inch tape, or Beta.

In the ’60s, it was better because they were shooting everything on film, so you have all this brilliant footage, which First Man was using, and Apollo 11, the beautiful documentary, was using and enhancing to look just spectacular. But we had this conundrum of switching between film and/or video. Some of the video was so poor that we ended up doing some full CG launches of different ships.

We worked with a company trying to do enhancements at 4K, i.e., a bunch of AI processing to try to make it sharper and better. We just couldn’t get it to cut with the interiors of the live action, it just didn’t feel right, so we ended up going full CG on some of those.

Then we also used stock footage that may have had the wrong name of a shuttle on it, so we’d replace the shuttle name. Sometimes we’d replace the sky to get it to cut in with other stock footage. A lot of our work in the show is changing stock or even doing deep fakes.

b&a: Yes, tell me about the deep fakes.

Jay Redd: This was about making politicians or famous people say things that they didn’t actually say, like Ronald Reagan. I’ve been following this company in Tel Aviv named Canny AI. I knew we needed to do these kinds of shots, so I contacted Canny AI saying, ‘Hey, we have some pretty bad quality newsreel footage, and I want to do some tests with you. How are we going to make this work, because it’s not high fidelity, it’s not HD, it’s really bad VHS transfers and then compressed into some crappy archive somewhere?’ It proved to be really challenging, honestly, to get the readability when you’re only dealing with like three scanlines on an image. But Canny AI was fantastic.

“Don’t be shy. Destroy it. Make the video bleed.”

We found a voice actor who sounds like Reagan—by the way, we did this for Johnny Carson too, who was our late night host, and also Gary Hart, who was a politician—and then, of course, we use their video as the source, and then we go through that whole process. And sometimes there’s even paint touch up on mouths to make it fit in a little bit more.

b&a: In episode 10, things really heat up, especially with the sequence involving the Pathfinder, Sea Dragon and Buran. How did you stage that in-space scene?

Jay Redd: Well, a challenge for me is I’ve learned more about orbital mechanics than I ever thought I would ever learn. You don’t steer a ship like you do a car on the Earth. We have propulsion, we have gravity, we have friction, we have mass, and all these things have to work together. It’s somewhat the same in space, but we don’t have air resistance, we don’t have aerodynamics. None of that is there. And so learning about why would you turn your ship around backwards and turn the engines on, I mean, it looks like you’re going to fly away! Well, that’s a braking burn to slow you down.

Also, it’s not a show where we can just fly and zoom in and go everywhere we want. We have more of that journalistic camera where we’re zooming in here and there a little bit. We try to not use it too much, but one of the challenges was just the choreography of, where is everything at a given time? They’re doing a standoff with the Buran, the Pathfinder, and then protecting Sea Dragon, and you always have the moon as a backdrop. And another challenge is that we’re on the unlit side of the moon, and that was part of the storytelling, that they’re away from communications from Earth, they can’t see each other, except on radar.

I worked with The Third Floor doing the previs for this. We actually started with literally some toys and talked about, ‘Where are things?’ And Sergio Mimica-Gezzan, our director, was really instrumental in all this of course. Ghost VFX worked on that entire sequence. I would give them some really rough sketches of the way I want things to look, and they’d come back with just awesome details on everything.

b&a: Finally, spoiler-alert, there’s that final pullback to Mars. What is the frame count on the final pullback and is it longer than the opening shot in Contact [a shot Redd worked on at Sony Pictures Imageworks], Jay?

Jay Redd: No, it’s not. So, that’s hilarious you asked that. It’s funny, because we used that a lot. That shot will never go away, for some reason. We used that shot to talk about this pullback shot at the end of season 2, particularly about just flares, and scale, and speed. Trying to stay as optical as we could. Zoic did that shot.

b&a: Is it a live action plate from the cemetery? Did it require any kind of drone or helicopter plates?

Jay Redd: We shot that top down plate here on the West Coast. We also involved stock footage from real funerals, from the Challenger disaster, intercut with our own stars. Again, very poor quality video from the archives. And so if we cut to our characters as if they were there, it didn’t look right. And so, we had these military burials with Reagan talking over just one coffin and we duplicated coffins—Crafty Apes did that work. They had to take our 4K footage and degrade it to match to the crappy video footage. I’d give them notes, saying things like, ‘Don’t be shy. Destroy it. Make the video bleed, change the color, make the distortion, add compression artefacts, just go for it.’ They did such a good job.
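The degrade pass Redd describes can be thought of as a chain of destructive operations, such as crushing tonal levels and adding analog-style noise. A toy sketch of that idea on a single 8-bit luminance value (the function and its parameters are illustrative, not Crafty Apes’ actual pipeline):

```python
import random

def degrade_pixel(value: int, levels: int = 32, noise: int = 12,
                  rng: random.Random = random.Random(0)) -> int:
    """Crush an 8-bit luminance value toward low-fi video: quantize it
    to a coarser set of levels, then add noise, clamped to 0..255."""
    step = 256 // levels
    quantized = (value // step) * step  # posterize: fewer tonal steps
    noisy = quantized + rng.randint(-noise, noise)
    return max(0, min(255, noisy))

# Degrade a simple gray ramp; highlights and shadows band and jitter.
row = [degrade_pixel(v) for v in range(0, 256, 16)]
```

A real match to archival tape would add chroma bleed, scanline distortion and compression blocking on top, per the notes quoted above, but the principle is the same: deliberately throwing information away.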

For the pullback, it was a big crane shot, but it’s just a little plot of rocks. So, we added the flame, put the headstones in, and with a 75 foot Technocrane we went as far as we could and then started taking it over and doing all the clouds and everything. Zoic did some terrific matte paintings around the area, 3D trees, and satellite-based landscape photos, and then lots of really, really good matte painting and 2.5D work.

b&a: Is the Mars terrain realistic? I mean, is it based on a real location there?

Jay Redd: It’s based loosely on some early data from one of the Mars surveyors, just as far as rolling hills go. But all the details with rocks and everything are put in by design. We’re purposefully not saying exactly where we’re landing on Mars, because we didn’t want to say where we’re going to be for season 3.

“It’s not a show where we can just fly and zoom in and go everywhere we want.”

While we were making that, I was literally watching Perseverance land. It was so surreal and I got really emotional when it was happening. I had the JPL feed going while Perseverance was entering Mars’ atmosphere. They’re showing the 3D animation in real-time, and I’m listening to the mission commander, and it enters the atmosphere, and I’m hearing applause from JPL. This is all happening while I’m writing notes to Jeff and Zoic about the Mars landscape, and they’re showing video from this thing. And I’m like, ‘What is happening!? Is this real life right now?’

For the boots coming down, we shot a plate with boots coming in, but we ended up not liking the design of them. We didn’t want to get married to, ‘Whose boots are they? Is it the Russians? Is it US? Is it somebody else?’ So, we ended up going CG on the boots, because we wanted them to come in and have dust, and a lens flare coming by the leg.

b&a: Just finally, which were the different VFX studios you worked with?

Jay Redd: There was Method Montreal, Method Vancouver, Zoic, Crafty Apes, Ghost, Refuge, Hybride, Union, Barnstorm VFX, Studio8 and TeaspoonVFX, which is Todd Sheridan Perry’s company. Todd worked on season 1 with Method and we just got on really well, and I thought he had a great eye, so I reached out to him saying, ‘Season 2 is going to get big and I can’t be on set all day. Do you want to come down and be on-set supervisor with me on the days that I can’t work on set?’ And it was a great collaboration.
