The opening sequence of James Gray’s Ad Astra sees astronaut Major Roy McBride (Brad Pitt) venture outside the ‘International Space Antenna’ before it is hit by some kind of power surge, with catastrophic results. As parts of the antenna explode and crumble, McBride ends up plummeting back to Earth in a dazzling free-fall.
A visual effects-heavy scene like that often requires extensive planning, and here that’s exactly what happened. Halon, under the supervision of Clint Reagan, provided previsualization for that sequence and many others in the film. Then, after the live action elements were filmed, Halon came on board again to postvis the scene.
So, what is postvis exactly, and why was it necessary on this sequence? Casey Pyke, Halon’s postvis supervisor, breaks down the steps for befores & afters. But first, check out a clip from the final antenna scene.
Why is postvis necessary?
Casey Pyke: The postvis workflow is very similar to a VFX workflow. Basically, it’s the same thing except the purpose is a little different. VFX is for what the audience is going to see, and postvis is for editorial, the studio, and those ‘judging’ the film, i.e. giving early notes on it. It’s also for all the people who see the film before a wide cinema audience sees it; in early screenings they often see postvis shots. So, we try to keep our quality bar as high as the budgetary and time constraints allow.
Part of the editorial process
They had already been cutting the sequence together and they gave us an edit that had either black cards or only previs in it. Now, all of these shots needed a space background. On this movie, instead of shooting on bluescreen or greenscreen, they shot on black. I don’t know exactly what it was, possibly velvet or something that really absorbed the light and gave the background no bounce light, no green spill, or anything like that. So for this scene we would start by rotoscoping the astronauts in the plates, removing wires and then putting in the backgrounds.
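Conceptually, shooting on black simplifies that isolation step: because the stage gives back almost no light, even a simple luminance matte already separates the suited astronauts from the void before the space background is slotted in behind them. Here is a minimal Python/NumPy sketch of that idea; the filenames and threshold values are hypothetical, and real postvis roto is of course done by artists in a compositing package, not a script like this.

```python
import numpy as np
import imageio.v3 as iio

# Hypothetical frames: an astronaut plate shot against black, and a star background.
plate = iio.imread("plate_astronaut_on_black.png")[..., :3].astype(np.float32) / 255.0
space = iio.imread("space_background.png")[..., :3].astype(np.float32) / 255.0

# Rec.709 luminance of the plate; anything brighter than the near-black stage
# is treated as foreground, with a soft ramp so the edge isn't a hard cut.
luma = 0.2126 * plate[..., 0] + 0.7152 * plate[..., 1] + 0.0722 * plate[..., 2]
matte = np.clip((luma - 0.02) / 0.08, 0.0, 1.0)[..., None]

# Slot the space background in behind the keyed foreground.
comp = plate * matte + space * (1.0 - matte)
iio.imwrite("postvis_comp.png", (comp * 255).astype(np.uint8))
```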
Generating CG elements
At Halon, we’ve been using Unreal Engine to render our previs. We start by building a model and creating textures in Photoshop or Substance Painter. And then we do animation in Maya. For Earth, for example, we took an Earth texture, generated a cloud layer, generated a glowing atmosphere, and we tried to accurately display what Earth would look like from the height that the movie wanted the antenna to be at.
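As a back-of-the-envelope check on that framing question, the apparent size of the Earth disc follows directly from the camera’s altitude. The short Python sketch below works through the geometry; the altitude value is purely a hypothetical placeholder, not a figure from the film.

```python
import math

R = 6371.0   # mean Earth radius, km
h = 80.0     # hypothetical camera altitude on the antenna, km (illustrative only)

# Angular diameter of the Earth disc as seen from that altitude:
# the limb sits along lines tangent to the globe.
angular_diameter = 2 * math.degrees(math.asin(R / (R + h)))

# Straight-line distance to the visible horizon.
horizon_distance = math.sqrt((R + h) ** 2 - R ** 2)

print(f"Earth fills ~{angular_diameter:.1f} degrees of the view")
print(f"Horizon is ~{horizon_distance:.0f} km away")
```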
All that stuff starts in Maya, but pretty quickly moves into Unreal Engine, which has a lot of rendering power and can also generate a much higher-quality look. It’s not exactly accurate to final film, but it gets you a look that is a lot closer.
Unreal is also able to handle much higher resolution textures without killing your render time. So, we were able to push the quality really heavily using Unreal Engine and display our assets the way we want them to be displayed, rather than compromising by just playblasting out of Maya, which is the traditional workflow that some studios use.
We didn’t need to do much digi-double work because a lot of the fall was achieved in-camera by putting Brad or a stunt double on a rig and spinning him around. They did some stuff on a crane, where he was hung from wires, so they could have a real sky in the background for when he falls into Earth’s atmosphere.
FX in postvis
There are a number of explosions on the antenna once the surge hits. The previs team did a version of those effects, which we took as a starting point and then, from director notes, refined. Unreal has a particle-effect system called Cascade, which is really powerful and versatile. We generated different aspects of the explosions: the fiery blast itself and the falling sparks that are the big feature of those shots. In Cascade, you can add modules, say a flashing light, or a module that is a bunch of sparks flying out, or a module that’s some smoke.
You just build them up in Unreal and customize them there. Then you place them in your shot and animate them to go off when you want them to go off. You can get them to collide with things in the environment, but for the most part, for postvis and previs, you can get away with things that don’t necessarily interact with your environment.
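Cascade itself is configured in the Unreal editor rather than in code, but the module-stacking idea can be sketched in plain Python: each module is a small behaviour applied to every particle each step, and layering modules builds up the effect (a flash, flying sparks, smoke). Everything below (class names, spawn rates, lifetimes) is a hypothetical illustration of the concept, not Unreal API.

```python
import random

class Particle:
    def __init__(self):
        self.pos = [0.0, 0.0, 0.0]
        self.vel = [random.uniform(-1, 1), random.uniform(-1, 1), random.uniform(2, 5)]
        self.age = 0.0
        self.lifetime = random.uniform(1.0, 2.5)

# Two example "modules": gravity pulls the sparks back down, drift integrates velocity.
def gravity_module(p, dt):
    p.vel[2] -= 9.8 * dt

def drift_module(p, dt):
    p.pos = [x + v * dt for x, v in zip(p.pos, p.vel)]

class Emitter:
    def __init__(self, modules, spawn_rate=120.0):
        self.modules = modules          # stacked behaviours, applied in order
        self.spawn_rate = spawn_rate    # particles per second
        self.particles = []
        self._spawn_accum = 0.0

    def step(self, dt):
        # Spawn new particles at the requested rate, then run every module on each one.
        self._spawn_accum += self.spawn_rate * dt
        count = int(self._spawn_accum)
        self._spawn_accum -= count
        self.particles += [Particle() for _ in range(count)]
        for p in self.particles:
            p.age += dt
            for module in self.modules:
                module(p, dt)
        # Retire particles that have outlived their random lifetime.
        self.particles = [p for p in self.particles if p.age < p.lifetime]

sparks = Emitter([gravity_module, drift_module])
for _ in range(120):                    # roughly two seconds at 60 steps per second
    sparks.step(1 / 60)
print(f"{len(sparks.particles)} sparks still alive")
```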
Changes from the previs
Scenes are always evolving. We were matching something very close to the previs sequence, but they had changes for us. They wanted the location on the antenna to be a little bit different, or for us to improve the look of the antenna for the postvis. That included the astronaut digi-doubles that we animated or used. We have a mocap library and used a little of it for somebody falling through the air; we put that on our digi-doubles and animated them falling off during the explosion.
There was a landing shot where they hadn’t shot him hitting the ground. It was intended to be a POV from his helmet cam of when Brad Pitt’s character hits the ground. That’s a shot we had to do entirely in postvis; it was all digital: a digital environment, a digital Brad Pitt, and all the animation that came along with him parachuting in for a landing and rolling on the ground.
Matching the live action cinematography in postvis
Hoyte van Hoytema shot the film, and in postvis we would add in things like lens flares and chromatic aberration to match what he was going for. Sometimes, you’ll have a greenscreen and there’ll be a lens flare, but by the time you’re done keying it, it’s gone. Or there might be just a bunch of film equipment in the background. By the time you’ve cleaned up the plate to be able to put a background behind it, you’ve lost the lens flares, or the chromatic aberration, or the glow that comes right before a lens flare when the sun gets really close to the camera. You may have lost that stuff, so then you have to regenerate it to try to get the same kind of look.
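One simple way to put a chromatic-aberration pass back over a cleaned-up plate is to rescale the colour channels by slightly different amounts about the frame centre, so the colours fringe increasingly toward the edges. The Python/SciPy sketch below does exactly that; the filenames and scale factors are hypothetical, not values used on the show.

```python
import numpy as np
import imageio.v3 as iio
from scipy import ndimage

img = iio.imread("cleaned_plate.png")[..., :3].astype(np.float32) / 255.0
h, w = img.shape[:2]

def push_outward(channel, factor):
    """Scale one channel up slightly about the frame centre, then crop back to size."""
    zoomed = ndimage.zoom(channel, factor, order=1)
    top = (zoomed.shape[0] - h) // 2
    left = (zoomed.shape[1] - w) // 2
    return zoomed[top:top + h, left:left + w]

# Red and blue are magnified by slightly different amounts relative to green,
# which produces colour fringing that increases toward the edges of frame.
out = np.stack([
    push_outward(img[..., 0], 1.004),
    img[..., 1],
    push_outward(img[..., 2], 1.002),
], axis=-1)

iio.imwrite("plate_with_ca.png", (np.clip(out, 0.0, 1.0) * 255).astype(np.uint8))
```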