So, how do you show a spacecraft launching from the ocean?

Crew members from Union VFX reveal how they did it in ‘For All Mankind.’

A post-credits sequence in the final episode of season one of Apple TV+’s For All Mankind depicts TV coverage of the sea launch of a large rocket, which is taking a plutonium payload to a lunar colony.

Working with production visual effects supervisor Jay Redd, Union VFX handled the rocket scene, ultimately delivering a 4K, 2544-frame single shot of the launch from under the ocean. It was based on a Sea Dragon rocket (never actually built, but part of the show’s alternate version of the ‘space race’, in which the Russians beat the Americans to the first moon landing).

To find out more about how the launch was crafted, befores & afters asked several members of the Union VFX crew about the steps involved. These included taking the original previs, building the rocket itself, and generating water, fire and smoke sims – 4 billion voxels in the process. The team also talks about utilizing cloud rendering for the work. Check out their run-downs below, plus Union’s VFX breakdown of the sequence.

Simon Hughes, VFX supervisor: This was the most ambitious fully CG shot of this scale, with this kind of FX simulation complexity, that we’d ever done at Union. It was really exciting, but my challenge was breaking down the component parts into manageable bodies of work.

The pieces were all interdependent, so from a supervision point of view I needed to coordinate the progression of the dependent parts at each stage before moving on to the next challenge. Given the time constraints and the huge compute required to simulate, we needed to be as efficient as possible – we couldn’t afford to waste anything.

Communication was essential – regular and open, full team cooperation. We had to maintain focus on the end goal despite having to wait to see our results. We wanted it to be photoreal. To keep the momentum we worked with WIPs, elements and stock to give ourselves the best toolkit for when the sims were ready to incorporate. We needed to keep moving to give ourselves options. There was no time to sit and wait.

The rocket was dreamed up but never actually built, so being involved in bringing it to life was a privilege. Also, working on the season finale shot of one of Apple TV+’s flagship launch programmes was pretty epic! All VFX teams like working on anything relating to sci-fi and space, and it doesn’t get much better than this.

Tiago Faria, lead/senior compositor: Early on, I designed the camera in Nuke and passed it to the 3D team so we could get feedback on it as quickly as possible. This was essential for such an FX-heavy shot, particularly given the timeline. My role on the project was to composite all the elements provided by the FX, CG and DMP teams, filling in any gaps and enhancing the FX simulations using real elements to bring everything together.

My biggest challenge was placing all the 3D and real elements in the same scene and ensuring they interacted with each other as they should – with CG layers, stock footage and DMP, these quickly numbered in the hundreds. This was particularly challenging when creating the reflections and lighting interactions, because it was one continuous shot more than a minute long. We had to smoothly introduce and phase out real elements without anywhere to hide. Their placement had to be really strategic and well-considered to get a believable end result.
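As a rough illustration of the kind of camera handoff Faria describes – passing a comp-designed camera from Nuke to the 3D team – here is a minimal Nuke Python sketch. The node name, file path and frame range are hypothetical placeholders, not Union’s actual setup.

```python
# Hypothetical sketch: export a comp-designed Nuke camera to Alembic
# so the 3D team can pick it up. Node name, path and frame range are
# placeholders, not the production setup.
import nuke

cam = nuke.toNode("LaunchCam")        # the camera animated in comp

writer = nuke.createNode("WriteGeo")  # Nuke's geometry/camera exporter
writer.setInput(0, cam)
writer["file"].setValue("/jobs/famk/launch/cam/launchCam_v001.abc")

# Bake and write the full shot range (2544 frames in the final delivery)
nuke.execute(writer, 1001, 1001 + 2543)
```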

Matt Moult, 3D artist: My biggest challenge was to convey the sheer scale of the rocket through modelling and texturing detail. It was a bit tricky given that much of the rocket is not much more than a large white cylinder with very little relatable detail to convey its scale. Initially, I tried to deal with the rocket mostly as one big object, but then found that breaking it into distinctly separate pieces and constantly referencing a human figure made it a little easier for me to convey the sense of scale. Texturing for ‘the wet look’ required a lot of testing to refine the shaders so they appeared ‘wet’ in the correct way.

I was also tasked with using the sim caches from FX to drive the animation of the buoys in the scene. They were generating huge, rapidly changing chunks of data that represented the waves, whitewater and so on. Not being an FX guy, I work on a machine that’s a little more humble than theirs, so I employed several different methods to optimize the data and make it more usable for my needs.
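This isn’t Union’s actual method, but a small sketch of the sort of cache-thinning Moult describes, assuming the water-surface sim has been exported as per-frame Nx3 point arrays; the file paths, buoy positions and sampling rates are made-up placeholders.

```python
# Hypothetical sketch of thinning a heavy water-surface cache so it can
# drive buoy animation on a modest workstation. Assumes the FX cache has
# been exported as per-frame Nx3 point arrays named ocean_surface.####.npy;
# paths, buoy positions and sampling rates are placeholders.
import numpy as np

BUOYS = np.array([[120.0, 0.0, 35.0],    # buoy positions on the water plane
                  [-80.0, 0.0, 60.0]])

def buoy_height(points, buoy, radius=2.0):
    """Average water height (y) of cached surface points near one buoy."""
    d = np.linalg.norm(points[:, [0, 2]] - buoy[[0, 2]], axis=1)
    near = points[d < radius]
    return near[:, 1].mean() if len(near) else 0.0

heights = []
for frame in range(1001, 1001 + 2544, 2):          # sample every other frame
    pts = np.load(f"/cache/ocean_surface.{frame:04d}.npy")
    pts = pts[::10]                                 # keep 1 point in 10
    heights.append([buoy_height(pts, b) for b in BUOYS])

heights = np.array(heights)
# A simple moving average smooths sim noise before keyframing the buoys.
kernel = np.ones(5) / 5.0
smoothed = np.apply_along_axis(
    lambda h: np.convolve(h, kernel, mode="same"), 0, heights)
```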

Jan Guilfoyle, producer: The length of the shot (2544 frames) and the fact that it was 4K meant that the time required to sim and render was huge – even with cloud render resources at our disposal. As we got closer to the hard deadline, many of the decisions we had to make with regard to rendering were “do or die” in the sense that if it was the wrong choice there would be no time to go back and try again in a different way. It was also difficult to get visibility on how long the sims and renders would take – at one point we were waiting on 3 CG frames rendering on the cloud that sat at 99% for 8 hours before they finally came through!

The sim and render times required meant that what we were seeing in comp was always about 1-2 weeks behind where we actually were in terms of our CG. This meant it didn’t look anywhere close to a final shot until very close to delivery when it all started to really come together.

This would obviously have been concerning for the client when we were showing versions a week from delivery that still had low-res renders, but we were fortunate that Jay Redd, the series VFX Supervisor, was fully understanding of the complexity and scale of the shot and had faith in us and our process right through to the end.

Marcelo Sousa, FX artist: This project presented a lot of FX challenges: the delivery timeframe, the length of the shot, the 4K delivery, the combination of water, whitewater, fire and smoke, and always a single camera on top of the action.

This was also our first time using the AWS cloud on a live project, so there were numerous teething problems. These factors resulted in us having a limited number of iterations before we had to deliver the final product. We had to be as efficient as possible to make every bite of the cherry count.

Matthew Aldridge, systems administrator: I was tasked with adapting our internal pipeline into an AWS cloud pipeline for this project. We had done a very limited proof of concept, but now had to scale that workflow up to an FX shot of this size and complexity, under the pressure of a live project with a tight turnaround. AWS allowed us to go from 34 in-house render nodes to 500 AWS instances at some points.

We learned a lot, but it was intense as we couldn’t afford to fail, and there were definitely bumps in the road. However, it gave us the ability to work at an unprecedented scale.
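For a sense of what that kind of burst capacity looks like in practice, here is a very rough boto3 sketch of requesting render instances on EC2. The AMI ID, instance type, region and tags are placeholders, and the real pipeline adaptation Aldridge describes involved far more than this.

```python
# Hypothetical sketch of bursting render capacity onto AWS with boto3.
# The AMI, instance type, region and counts are placeholders, not
# Union's actual configuration.
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-2")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # pre-baked render-node image (placeholder)
    InstanceType="c5.9xlarge",         # CPU-heavy instances suit sim/render work
    MinCount=1,
    MaxCount=500,                      # burst well beyond the 34 in-house nodes
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "Purpose", "Value": "render-farm-burst"}],
    }],
)

instance_ids = [i["InstanceId"] for i in response["Instances"]]
print(f"Requested {len(instance_ids)} render nodes")
```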
