Cloth sims for an alien entity, making the sky one giant visual effect, and a rampaging chimpanzee

August 31, 2022
Final shot.

Nope contains some of the most eclectic, and beautiful, visual effects you’ll see this year.

How’s this for a set of visual effects challenges: an alien entity dubbed Jean Jacket that ‘unfolds’ and is driven by the wind, computer-generated and art-directed clouds, dust simulations that match helicopter-created dust wake, a CG chimpanzee, and a unique approach to day-for-night shooting with infrared.

These were just some of the tasks set for the VFX team led by MPC on Jordan Peele’s Nope, a movie shot on both IMAX film cameras and 65 mm. Here, visual effects supervisor Guillaume Rocheron, who hails from MPC, describes the extensive work in the film for befores & afters.

The design of Jean Jacket started with its final form

Guillaume Rocheron: It was a really interesting design process. Jean Jacket, the alien entity, transforms, which meant that we actually designed it the other way around. That is, we started with the final “transformed” form instead of starting from its initial shape. When Jordan was writing the script, he was toying with all these ideas, and then we started to work on the design. If we’re talking about a wind-like entity that has all these characteristics, how do you visualize that? It was a very abstract thing for him even to write in the script. As he was writing, we were designing, and that was feeding some of the ideas. It’s one thing to write, ‘It’s a wind entity that can do a lot of stuff,’ but you also have to ground it in some realism.

Plate.
Final shot of Jean Jacket in ‘unfolded’ form.

For its closed shape, we looked at a lot of the classic movies, like The Day The Earth Stood Still and Mars Attacks!, just to find the most classic saucers, rather than more sophisticated designs. Jordan was really trying to capture that iconography, and then say, ‘Okay, but what does it look like when it unfolds? Is there anything we need to learn from its unfolded shape that needs to inform how we’re going to design the saucer?’

Neon Genesis Evangelion as inspiration

In our first brainstorming about it, we actually talked about Michael Myers and the mask from Halloween, in terms of how you perceive this thing in the clouds and not really having a face. We then very quickly started to geek out quite a lot on Neon Genesis Evangelion. Jordan happens to be a big fan and I happen to be one, too.

It’s the minimalism of Evangelion that we really admired. The design serves the function. When you look at the Angels, it’s like they have a purpose or a function or a way to operate and a design strictly tailored to just do that. So we started a few rounds of designs on this and then very quickly we came in with a very Evangelion-esque alien entity that looked like origami and was, at the same time, a very simple design. I think Jordan fell in love with it.

So we said, ‘This is the language,’ and then we started to design the saucer. We then knew we wanted these very simplistic shapes, very little texture, very few features, very much ‘not much.’

A whole cloth sim

For the saucer shape itself, we always wanted to take inspiration from nature. We looked at sand dollars, for example, which have that round silhouette, but not a perfectly round one. We used that as a guide. The skin of it was very much just minimalistic, which we usually don’t like very much in visual effects! When someone tells you, ‘Hey, I want something with no features,’ you’re just like, ‘Okay, that’s great, but how do you make this feel real somehow?’

Plate.
Helicopter wash reference.
FX sims.
Final shot.

Usually we like to just scatter a ton of details on CG objects, because it helps your eye and your brain understand the scale. It’s funny because we’ve spent many, many years developing all sorts of beautiful muscle systems and skin deformation and things like this, and then suddenly it’s like, ‘Well, it’s an entity that literally will be made entirely of cloth sims.’ And it is literally just a whole cloth sim. Anytime we tried to put some textures on it, it was, ‘Yeah, it looks a little too creature-like, in a way.’ So it always went back to this idea of minimalism with blank features.

At first, we designed it with almost tentacle-like appendages at the bottom that were flat but flowed in the wind. In one of our meetings, we said, ‘Well, what if it’s a skirt, like Marilyn Monroe’s skirt? Just something that really flows.’ Jordan loved the idea, because of the theme of the movie, too. Suddenly, we started to abandon any sort of literal appendages and features. We just went more and more simplistic, up to the point where it was like, ‘Well, the only way to give it some details is through cloth simulations.’ Its function is just to flow in the wind.

Jellyfish-like

Jordan really wanted an entity of the wind, so after the initial cosmetic design we thought, ‘We should talk to real scientists, people who know aerodynamics and biology and fluid dynamics and all these things.’ We connected with the guys at JPL. We talked a lot about ion propulsion. We connected with Professor Dabiri, who is at Caltech. He’s an expert in fluid dynamics and jellyfish, and he has a jellyfish lab at Caltech. Not that we wanted to make our creature look like a jellyfish; if anything, we actually tried to pull away as much as possible from the ‘jelly’ feel. But there were some fascinating facts about it, such as that a jellyfish is basically all efficiency. It’s the most efficient animal in the ocean.

That informed us as to how the saucer would move. We thought of it as an incredibly light surface that rides wind currents. It basically just captures the different flows and the different currents. When you see it unfold, it’s literally just a center structure with a brain and then some hoists or ropes that basically connect the brain to the sail, and the sail is made to control the speed. The skirt controls its flotation. We went in and just tailored the design, even through the animation process, to just be like, ‘Okay, everything is functional. Everything needs to have a function.’

The animation was quite simplistic. We could control the tension of the hoists and the amount of unfolding and the basic shapes, but you always had to imagine how that would translate, later on, into a massive cloth simulation. Iterations would take quite a long time, because this is such a massive object. When it’s unfolded, it’s almost 480 feet wide. It’s a lot of fabric to simulate.
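Rocheron doesn’t detail MPC’s cloth solver, but the cost he describes is easy to see even in a toy setup: a mass-spring cloth integrated with Verlet steps and relaxed with simple distance constraints, where the point and spring counts (and so the cost per frame) grow quadratically with the cloth’s linear size at a fixed weave density. The sketch below is purely illustrative; the resolution, damping, pinning and step counts are invented, not production values.

```python
# Toy mass-spring cloth with Verlet integration and distance constraints,
# just to illustrate why iterating on a ~480-foot cloth entity is expensive.
# Resolution, damping and stiffness here are invented, not MPC's setup.
import numpy as np

N = 60                                   # points per side (production counts are far higher)
WIDTH_M = 480 * 0.3048                   # ~480 ft expressed in metres
SPACING = WIDTH_M / (N - 1)              # rest length of each structural spring
DT = 1.0 / 24.0                          # one film frame per step
GRAVITY = np.array([0.0, -9.81, 0.0])
PASSES = 15                              # constraint-relaxation passes per frame

# Start as a flat sheet; pin one edge so it hangs and flows rather than free-falls.
xs, zs = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
pos = np.stack([xs * SPACING, np.zeros_like(xs, float), zs * SPACING], axis=-1)
prev = pos.copy()
pinned = pos[:, 0].copy()

def relax(p):
    """Pull neighbouring points back toward their rest spacing (structural springs only)."""
    for _ in range(PASSES):
        for a, b in ((p[:-1, :], p[1:, :]), (p[:, :-1], p[:, 1:])):
            delta = b - a
            dist = np.linalg.norm(delta, axis=-1, keepdims=True)
            corr = 0.5 * (1.0 - SPACING / np.maximum(dist, 1e-9)) * delta
            a += corr                     # slices are views, so this edits p in place
            b -= corr
        p[:, 0] = pinned                  # re-apply the pinned edge after each pass

def step(p, p_prev):
    """One Verlet step: inertia plus gravity, then constraint relaxation."""
    vel = p - p_prev
    p_prev[:] = p
    p += 0.99 * vel + GRAVITY * DT * DT
    relax(p)

for _ in range(48):                       # simulate two seconds at 24 fps
    step(pos, prev)
print("points:", N * N, "lowest point (m):", round(float(pos[..., 1].min()), 2))
```

Doubling the grid resolution in this toy quadruples the work per pass; a production solver with bend and shear springs, collisions and self-collisions scales far worse, which is why each iteration on Jean Jacket took so long.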

The sky: one big visual effect

At one point Jordan and I looked at each other and said, ‘You know what? The sky is going to be the biggest challenge of the movie. It’s not even the alien entity, it’s going to be the sky.’ The sky in this movie has to be the ocean in Jaws, basically. You have to use it to create suspense and to showcase things in a slightly different way.

Cloud R&D.

Basically, the skies in the movie are all visual effects. There are only two shots with real skies in the film, because we wanted to art direct the skies every time. The clouds and the animation and the staging of it all were very connected. We filmed everything on location, but the skies were literally all sets, basically all-CG sets.

For this, we designed a system where we generated large sky boxes made of geometry that would be an approximation of clouds, which could then pass through our previs team and our animation team. You could literally compose a cloudscape by moving geometry around. We treated it the same way you would if you were designing a CG jungle or any CG set, for that matter.
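As a rough illustration of that ‘cloudscape as a CG set’ idea, the sketch below treats each cloud as a transformable proxy primitive that previs and animation can place, and that a later step can drift and hand downstream to be converted into volumes. The class names, fields and values are hypothetical, not MPC’s actual system.

```python
# Toy "cloudscape as a CG set" layout: clouds as transformable proxy primitives
# that previs and animation can place like set dressing, then hand downstream to
# be turned into volumes. Names, fields and numbers are illustrative only.
from dataclasses import dataclass, field

@dataclass
class CloudProxy:
    name: str
    position: tuple[float, float, float]   # world-space position, metres
    scale: tuple[float, float, float]      # rough extents of the cloud mass
    density: float = 1.0                   # multiplier used later when volumes are built

@dataclass
class Cloudscape:
    proxies: list[CloudProxy] = field(default_factory=list)

    def drift(self, wind_mps: tuple[float, float, float], seconds: float) -> None:
        """Drift the whole sky with one wind vector, so the same cloudscape
        carries continuously across every shot in a sequence."""
        for p in self.proxies:
            p.position = tuple(c + w * seconds for c, w in zip(p.position, wind_mps))

# Compose a sky the way you would dress any CG set: by placing geometry.
sky = Cloudscape([
    CloudProxy("cumulus_far",  position=(-800.0, 1500.0, 2500.0), scale=(600.0, 250.0, 400.0)),
    CloudProxy("hiding_cloud", position=(200.0, 1800.0, 3000.0),  scale=(900.0, 300.0, 500.0)),
])
sky.drift(wind_mps=(4.0, 0.0, 0.0), seconds=30.0)   # same sky, 30 seconds later in the scene
print(sky.proxies[0].position)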

Further cloud research.

We would compose the cloudscapes and animate them and time them for every sequence. We worked on the sequence level instead of shot level, where the same cloudscape would drift throughout the whole scene and then we just added or removed the clouds that we needed.

In post, MPC spent quite a long time on the skies, taking that initial geometry and then re-simulating volumes based on it. Then, because some of the movie was shot in IMAX and fluid simulations at that resolution are pretty crazy, there was a whole process of rescattering volumes with volumes to create subjects or details within the clouds.
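The ‘rescattering volumes with volumes’ step isn’t spelled out in the interview. As a stand-in for the general idea of turning proxy geometry into a density field and then breaking up its smooth shapes with finer detail, here is a toy voxelization pass; the resolution, falloffs and noise are invented, and production detail passes are far more sophisticated.

```python
# Toy geometry-to-volume pass: rasterise proxy cloud ellipsoids into a coarse
# density grid, then add detail by modulating with higher-frequency noise.
# This is only a stand-in for the production detailing step, which the
# interview does not describe; all values below are invented.
import numpy as np

RES = 96                                       # voxels per axis (IMAX-quality sims are far larger)
grid = np.zeros((RES, RES, RES), dtype=np.float32)

# Normalised voxel-centre coordinates in [0, 1).
coords = (np.stack(np.meshgrid(*[np.arange(RES)] * 3, indexing="ij"), axis=-1) + 0.5) / RES

ellipsoids = [                                  # (centre, radii) in normalised sky-box space
    ((0.35, 0.60, 0.50), (0.25, 0.10, 0.18)),
    ((0.65, 0.55, 0.45), (0.20, 0.08, 0.15)),
]
for centre, radii in ellipsoids:
    d = np.linalg.norm((coords - centre) / radii, axis=-1)   # 1.0 at the ellipsoid surface
    grid += np.clip(1.0 - d, 0.0, None)                      # soft falloff toward the edge

# Crude high-frequency "detail" pass: random value noise, smoothed and remapped.
rng = np.random.default_rng(7)
noise = rng.random((RES, RES, RES)).astype(np.float32)
for axis in range(3):                                         # cheap box blur to soften the noise
    noise = (np.roll(noise, 1, axis) + noise + np.roll(noise, -1, axis)) / 3.0
grid *= 0.6 + 0.8 * noise                                     # break up the smooth proxy shapes
print("non-empty voxels:", int((grid > 0.05).sum()))
```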

Shooting plates for clouds

The way we shot a plate was, we’d do the previs and then export all the sequences into Unreal Engine. That gave us a basic representation of volumes in Unreal. This was before Unreal 5, so we had our own version of a cloud system that imported the geometry and then converted it into volume slices. On set, we made an iPad version of that whole Unreal scene. The clouds were converted back to geometry, because the iPad can’t run Unreal and the volumes at the same time.

Plate.
Layout.
Final shot.

I had all the encounters on an app that MPC developed called VPad, which lets you view fully animated sequences through AR. We had the moving clouds, the moving saucer. I had it all mapped onto the location, and then we mapped the IMAX lenses to the lens of the iPad, so we could say, ‘Alright, let’s go on the 50mm IMAX,’ and I would be sitting with Jordan and the DOP Hoyte van Hoytema and we would just look at the shot on the iPad in AR. We’d be like, ‘Alright, that’s what you see. That’s where Jean Jacket goes…’.

I call it the poor man’s virtual production, because we didn’t deploy something that was tracking the main camera. Remember, we shot on 65 mm film and on IMAX cameras and on location, so it would have been quite a setup to have all that tracking done.
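Mapping the IMAX lenses to the iPad’s lens comes down to matching angle of view. Here is a minimal sketch of that math, assuming an approximate 15/70 gate width and a placeholder tablet FOV; neither number comes from the production.

```python
# Matching an IMAX lens field of view on an iPad AR viewfinder: compute the
# horizontal FOV from focal length and gate width, then crop (or letterbox)
# the device camera to that angle. The 15/70 gate size below is approximate
# and the iPad camera value is a placeholder, not a measured number.
import math

def horizontal_fov_deg(focal_length_mm: float, gate_width_mm: float) -> float:
    """Angle of view across the gate width for a given focal length."""
    return math.degrees(2.0 * math.atan(gate_width_mm / (2.0 * focal_length_mm)))

IMAX_GATE_W = 70.0        # ~15/70 camera gate width in mm (approximate)
IPAD_FOV = 72.0           # placeholder horizontal FOV of the tablet camera, degrees

for focal in (50.0, 80.0):
    target = horizontal_fov_deg(focal, IMAX_GATE_W)
    # Fraction of the iPad frame width that matches the IMAX framing
    # (only meaningful when the target FOV is narrower than the device FOV).
    crop = math.tan(math.radians(target) / 2.0) / math.tan(math.radians(IPAD_FOV) / 2.0)
    print(f"{focal:>4.0f}mm IMAX lens -> {target:5.1f} deg HFOV, crop to {crop:.0%} of the iPad frame")
```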

One thing to note is that no matter what we did with the ship in the clouds, we always tried to somehow obscure it and reveal it slowly. Having your audience imagine what’s happening there is very powerful. For me, growing up in the ’80s, this is how I experienced a lot of the movies I was watching. I think for Jordan it was the same thing.

Helicopter wash and MPC simulations

For the dust effects when Jean Jacket is picking people and things up from the ground, we were always looking for something to help with interaction. We always tried to shoot something practical that we would then augment. For the dust, because the saucer is moving so fast and the scale is so big, we just used a helicopter as a dust machine, basically a giant fan.

Plate.
Helicopter reference.
FX simulation.
Final shot.

We had a helicopter that was not actually there to film anything, but was literally just there as a gigantic fan, to blow a huge amount of dust and travel as fast as it could along the ground. It was our foundation for everything. I mean, we just got dusted on every day for weeks and weeks and weeks while we were shooting. That was really our way to ground the camera work and see how the lighting looked in the dust, how the lighting changed in the dust. You just get so much out of it.

Then, when you start to put your CG elements into it, there’s nowhere to hide. You just have to be as photographic as possible. I think that connects the saucer to the ground, which means it connects it to the audience. It was always our mission, to try to somehow connect it to the ground.

Bringing Gordy to life

Terry Notary played Gordy, the chimpanzee, on set and was just incredible. It was really all about trying to create a scene that felt scary and very disturbing, while also obscuring things in a similar fashion to what we did with the clouds. It was like, ‘Well, what if we use the same idea, that you’ll witness it through the point of view of our character, which is little Jupe?’ We are always from his point of view, the same way as we’re always from the point of view of our characters when they are looking at Jean Jacket. Here it’s from under a table, and there’s a very transparent silk tablecloth there.

Terry Notary on set.
Shot in progress.
Final shot.

You’re looking at the chimp that is going on a rampage, but somehow, because it’s a little obscured and it’s through the silk fibers of the cloth, everything gets a bit more sedated. Obviously, it makes life much harder, because now you suddenly have to render a tablecloth with all these fibers that thread the light in the same way and filter Gordy in the same way.

For the performance with Terry, we actually built an oversized set. We made it 30% bigger, so when Terry was standing in the set next to the sofa, he would be at chimp scale, basically. We had Terry in the oversized set, and we put him in full Gordy clothes with blood, with everything, and he could interact with anything that was on the set at the right scale. Even for the little girl, Mary Jo, who is a teenager in the show, we found a 6’2″ stuntwoman who was basically 30% taller than her and put her there. Terry could climb on the sofa and the interactions would be right. He’d leave a trail of blood on it and then interact with the tablecloth, touch the foot of the little girl; everything was at the right size.

Gordy CG development.

We then had seven witness cameras around the set that we could hide, and we ‘faux cap’d’ based on that. We really wanted to get the performance from Terry on this and the interactions. Once we had a first pass of animation based on all the witness cams, then came the work of adapting the performance to chimp features. For example, chimps don’t move their faces the same way as humans. Then, obviously, the fur and the cloth and the sims and the blood and all these things were VFX we had to do.
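The ‘faux cap’ pass isn’t described beyond that, but the core operation in reconstructing a performer from several calibrated views is triangulation. Below is a toy sketch with two invented camera matrices; a real solve would use all seven witness cameras, many tracked markers and every frame.

```python
# Toy multi-camera triangulation, the core operation behind reconstructing a
# performer's motion from calibrated witness cameras. The projection matrices
# and the tracked point below are invented for illustration.
import numpy as np

def triangulate(projections, points_2d):
    """Linear (DLT) triangulation of one 3D point from >= 2 calibrated views.
    projections: list of 3x4 camera matrices; points_2d: matching (x, y) pixels."""
    rows = []
    for P, (x, y) in zip(projections, points_2d):
        rows.append(x * P[2] - P[0])
        rows.append(y * P[2] - P[1])
    _, _, vt = np.linalg.svd(np.asarray(rows))
    X = vt[-1]
    return X[:3] / X[3]                      # de-homogenise

# Two hypothetical witness cameras looking at the set from different positions.
K = np.array([[1200.0, 0, 960], [0, 1200.0, 540], [0, 0, 1]])      # shared intrinsics
P1 = K @ np.hstack([np.eye(3), np.array([[0.0], [0.0], [0.0]])])    # camera at the origin
P2 = K @ np.hstack([np.eye(3), np.array([[-2.0], [0.0], [0.0]])])   # camera 2 m to the side

true_point = np.array([0.3, -0.1, 4.0])                              # e.g. a hand marker

def project(P, X):
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

estimate = triangulate([P1, P2], [project(P1, true_point), project(P2, true_point)])
print(np.allclose(estimate, true_point))                             # True in the noise-free case
```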

IMAX, infrared and day-for-night

The night scenes were ones where Hoyte used a combo rig of an infrared-modified ALEXA 65 and a Panavision System 65 film camera, shooting day-for-night. All the night shots were visual effects shots. To work out what we’d need to put in there, we all went to the location in the middle of the night. We turned off all the cars and all the lights and suddenly we were in pitch-black darkness. After a minute, you start to see shapes. After two minutes, you start to see colors, and then suddenly you start to see the sky and things that are very far away. We said, ‘Okay, we want to create nights that are immersive like this.’

Plate (color).
Plate (infrared).
Layout.
Final shot.

We had considered the traditional day-for-night techniques. The problem with those is that all you can really do is manipulate a daytime frame, give it a bit of night color, and then you have a lot of things to deal with. Hoyte had used a bit of infrared to create dark skies on the moon for Ad Astra. We started to think, ‘Well, how about we use this approach to actually create nights?’

What infrared does is make blue skies become black skies. Somehow, the contrasts that you see are more akin to how you see at night. It doesn’t really work on faces, though, especially on eyes; the eyes look a bit strange. There are a few things that don’t react too well to infrared, but overall it gives you an image that is already fairly balanced for nighttime. The only problem is, obviously, it’s black and white. That’s where the 3D rig came in. We built a sync box to synchronize an ALEXA 65 that was modified for infrared with the Panavision 65mm film camera. We synchronized them, synchronized the shutters, and then we would line them up with lasers. The size of the sensor is more or less the same; there’s a 5% difference, and we managed to realign that.

Plate (color).
Plate (infrared).
Layout.
Lighting.
Final shot.

What it gives you is two images for every shot: the infrared one and the color one. We would colorize the infrared footage and then we would run those shots through matchmove to extract a depth pass for the whole thing. From the depth pass, we could then modulate the visibility and how you silhouette things. It’s a lot of very intricate steps. So, first, we align everything together, but we also then treat the faces, treat the eyes, treat the depth and then treat the colors. The result doesn’t feel black and white, but it doesn’t feel like classic ‘blue’ movie nights either. We just wanted to find that in-between.
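None of those colorize and depth steps are specified in detail. One simple way to picture the idea: take luminance from the infrared plate, take chroma from the aligned color plate nudged toward a cooler palette, and attenuate with the extracted depth. The sketch below uses synthetic arrays and invented constants; it is not MPC’s comp recipe.

```python
# Toy illustration of the day-for-night comp idea: luminance from the infrared
# plate, chroma from the aligned colour plate (pushed slightly cooler), then a
# depth-based attenuation so distant things recede into darkness.
# All arrays and constants are synthetic placeholders, not production values.
import numpy as np

H, W = 270, 480
rng = np.random.default_rng(0)
ir_plate = rng.random((H, W)).astype(np.float32)          # stand-in infrared luminance, 0..1
color_plate = rng.random((H, W, 3)).astype(np.float32)    # stand-in aligned colour plate, 0..1
depth = np.linspace(5.0, 500.0, W, dtype=np.float32)[None, :].repeat(H, axis=0)  # metres

# Chroma from the colour plate: normalise each pixel by its own luminance.
luma = color_plate @ np.array([0.2126, 0.7152, 0.0722], dtype=np.float32)
chroma = color_plate / np.maximum(luma, 1e-4)[..., None]

night = ir_plate[..., None] * chroma                        # IR luminance carries the exposure
night *= np.array([0.75, 0.85, 1.0], dtype=np.float32)      # gentle cool bias, well short of 'blue movie night'
night *= np.exp(-depth / 300.0)[..., None]                  # depth pass modulates visibility with distance
night = np.clip(night, 0.0, 1.0)
print(night.shape, float(night.mean()))
```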

We had to do a lot of testing for production. It was a big leap of faith, because we were initially shooting some of those night scenes during the day in the Californian heat, at 40 degrees centigrade. We’d look at each other and say, ‘It better work, this night thing,’ while we were shooting literally at 12 noon. But it was pretty satisfying to get something that looks like night the way the eye sees it, and that doesn’t look like a grade or a visual effect.

Inflatable tube men: mostly real

For those inflatable tube men, they were pretty much all there; we did just a few in VFX. They designed an incredible radio control system where you had a hundred of those. It could be programmed with patterns and waves, or it could be like, ‘Turn on B52.’ Our gaffer just did an insane job at rigging that thing. It was amazing, actually, and very surreal. It’s almost peaceful. You walk into the valley on the morning of the shoot and you have all these inflatable tubes. It’s like, ‘Ah, that’s kind of nice.’

