A breakdown of how the rig worked, with visual effects supervisor Jake Morrison.
A scene that many may remember from Thor: Ragnarok was the slow-motion Valkyrie flashback. That sequence benefited from a specialized multi-light lighting rig developed by Satellite Lab. The team returned with a new lighting rig for something just as visually spectacular on Taika Waititi’s follow-up, Thor: Love and Thunder, for the ‘Moon of Shame’ sequence.
This is where Thor (Chris Hemsworth), Jane Foster/Mighty Thor (Natalie Portman) and Valkyrie (Tessa Thompson) visit the Shadow Realm to confront Gorr (Christian Bale) and find themselves battling Gorr and shadow creatures on a tiny moon.
The distinctive moving light on the characters, and their oscillation between black and white and color, was made possible using the lighting rig, dubbed Platelight. It was set up to capture multiple strobe-like lighting set-ups all at once with the same high-speed camera. Since each lighting ‘pass’ was effectively recorded as separate footage, those passes (i.e. the lighting) could then be controlled in post. The final visual effects work for the sequence was handled by Method Studios, now Framestore, in Montreal.
Here, Love and Thunder visual effects supervisor Jake Morrison, who also worked on Ragnarok, explains more of the process for shooting this Shadow Realm sequence, and what the various lighting passes allowed the filmmakers to achieve. You can also listen to the befores & afters podcast on the tech with the Satellite Lab team.
Check out the end of this article, too, for a note on some earlier and ongoing research in the same area published by Paul Debevec and USC Institute for Creative Technologies.
First, the idea for a tiny moon
Jake Morrison (visual effects supervisor, Thor: Love and Thunder): We needed Gorr’s lair somewhere out there. He had to have a home base. And that was when the Moon of Shame suddenly came to be. It’s one of the things I’m most proud of. It’s up there with the Valkyrie flashback sequence from Thor: Ragnarok, in terms of being visually spectacular.
There are three or four different things running in that sequence, simultaneously, but we narratively walk the audience through them. The first thing is the sight gag of the boat going ‘Bang!’ into the moon, and you realize it’s a tiny moon. Then there’s a sideways shot of the boat falling over and the goats screaming, as they do. This shows the ‘there’s no up in the universe’ aspect.
Then we’ve got what we called the hamster wheel shot, where they’re walking on top of the planet and the planet’s spinning below them. That was a shot that I put together with Taika a really long time ago in previs. The sum of those shots is to show the audience immediately that this is a weird place but still start to establish the visual rules. And that it’s a small moon!
‘We drained the planet of this particular type of light’
We shot with a progression on the lighting rig that we did for the last picture, the one from Satellite Lab that we used for Valkyrie’s flashback in Thor: Ragnarok. It’s a lighting rig they call Platelight that allows you to shoot your actors by lighting them normally, but you’re actually lighting them from six different angles. You’re lighting with a keylight and an accompanying fill for every single angle, but instead of them being continuous sources you turn these lights into what are effectively strobe lights.
Then, when you shoot your actors, you shoot with an incredibly high speed camera. But, instead of it being a single strobe light per frame, like we used in the Valkyrie flashback, you’re now using huge banks of 50 or 60 industrial-level strobes – lights that can be controlled within milliseconds. I’m not kidding when I say this – we drained the planet of this particular type of light. At one point I had our gaffer, Reg Garside, saying, ‘I think I found two more in Iceland…’. So we gathered these lights, then built these massive six light banks all the way around the sound stage in Sydney at Fox Studios.
Kinda like AOVs
The lights strobe in sequence so fast that for the actors, or anybody on stage, it just looked normal. But what was happening in the raw footage was that you could see the change in lighting; if you went frame by frame you would see literally the same moment in time but from six different lighting positions, per frame. We called that an ‘undealt pack’. In post we would take that pack and we’d ‘deal it out’. You would be able to look at each light – light one, light two, light three, light four, etc – as six different sets of footage. We ended up with something very similar to what we use in post-production – AOVs, arbitrary output variables – but with live action for the first time ever.
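The ‘dealing out’ described above is, in effect, a de-interleave: with six lighting set-ups firing in rotation, pass k is every sixth captured frame starting at offset k. A minimal sketch of that idea, using plain frame indices as stand-ins for real image data (the frame counts here are illustrative, not production figures):

```python
NUM_LIGHTS = 6  # six lighting set-ups, as described in the article

def deal_pack(frames, num_lights=NUM_LIGHTS):
    """Split an interleaved high-speed frame sequence into one
    sub-sequence per light: pass k is every num_lights-th frame,
    starting at offset k."""
    return [frames[k::num_lights] for k in range(num_lights)]

# Stand-in "frames": just the indices of a 24-frame burst.
frames = list(range(24))
passes = deal_pack(frames)
# passes[0] holds the frames lit by light one, passes[1] by light two, etc.
```

Each resulting sub-sequence then plays back as its own continuous ‘lighting pass’ of the same performance.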
This meant we were able to have these six different incident angles around our actors in the middle of an action scene. We wanted to communicate the weirdness of the moon, for gravity to behave strangely here. How you might do that normally is with moving lights during a scene. But in an action sequence, every time you try and do a moving light gag, you fail. It fails because when you shoot it, it all looks fine, but then when it gets to editorial, especially for a fight scene, it doesn’t cut together continuously as things are tightened up and shots are re-ordered. Over time we’ve come to use less and less interactive, dramatically moving light to avoid this, but I feel this is the reason that we’re starting to see a lot of homogenous films. This technology allowed us to be very, very bold visually.
We had the ability to take each of those lighting passes and then blend them one after another. It feels like wheeling light, but actually you’re just moving from one different lighting position to another one.
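That ‘wheeling light’ effect can be thought of as a time-varying crossfade between adjacent lighting passes. Because light is additive, a linear blend of two passes is itself a plausible lighting state (both lights partially on). A sketch under that assumption, with flat float lists standing in for image frames (the function names and sweep parameterization are my own, not the production pipeline’s):

```python
def wheel_weight(t, num_lights, period):
    """For output time t, return (pass_a, pass_b, blend) so that one
    full sweep around all lighting positions takes `period` seconds."""
    phase = (t % period) / period * num_lights
    a = int(phase) % num_lights
    b = (a + 1) % num_lights
    return a, b, phase - int(phase)

def blend(frame_a, frame_b, w):
    """Linear crossfade between two lighting passes (w in [0, 1])."""
    return [(1 - w) * pa + w * pb for pa, pb in zip(frame_a, frame_b)]
```

Stepping `t` forward each output frame and blending the two indexed passes gives the appearance of a light moving around the subject, even though every source frame was lit statically.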
A challenge: making the backgrounds look interesting
We filmed this all on a little slab of pretend moon surface that was built on stage in Sydney. As the stage piece never changed I was worried it would mean all that lighting work wouldn’t look very interesting. But then we said, ‘Let’s wreck the planet to make it more visually interesting!’ The minute we destroyed the planet it gave us a good excuse for all this floating, low-gravity debris, plus a bed of dust.
My pitch to Taika was to make something like car headlights in fog, but do the opposite of that. Let’s have everything be lit, but ‘the car headlights’ that come through are actually shadows going through. It ended up being volumetric shadows aplenty that Method Studios in Montreal layered in there.
The camera we were using was the Phantom v2640 ONYX. It can go up to an insane shooting rate, which helps make sure each lighting pass you’re shooting is as closely aligned in time as possible to the next one. You can get incredibly up there, speed-wise, but as with any physical production there’s a balance between the theoretical and the practical: the faster you shoot, the more light you need. We ended up settling on 576 frames a second, which is 24 by 24. Each pass is actually separated by a tiny delay which needs realigning using optical flow.
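Some back-of-envelope arithmetic on those numbers, under my own reading that the six lighting set-ups are interleaved at the full camera rate: each individual pass is then effectively captured at 96 fps, and adjacent passes are offset by one camera sub-frame of roughly 1.74 ms, which is the tiny delay the optical-flow realignment has to remove. Only the 576 fps figure and the six set-ups come from the article; the breakdown is my own.

```python
CAMERA_FPS = 576   # quoted shooting rate: "24 by 24"
NUM_LIGHTS = 6     # six interleaved lighting set-ups

# Effective frame rate of any one lighting pass after dealing out the pack.
per_pass_fps = CAMERA_FPS / NUM_LIGHTS          # 96.0 fps per pass

# Temporal offset between adjacent passes: one camera sub-frame.
sub_frame_delay_ms = 1000.0 / CAMERA_FPS        # ~1.74 ms

assert CAMERA_FPS == 24 * 24
```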
My grand plan on that one was something I learned years ago when I was working with ReelSmart Motion Blur, when we couldn’t afford ‘posh’ renders. You had to render without motion blur and then add it back later on. You would basically shoot a texture pass, and then you would have all the information to generate the motion vectors and work the blur back in from them. During the shoot, we did a registration pass, then took the different layers of lights and sync’d them up perfectly.
We had a process where the plates were grouped as six different lighting layers. Whenever VFX editorial put in a request for a particular plate, all six of them would come out and they’d all have sync’d frame numbers. That would go to the vendor, in this case Method Studios, all as one.
Shadow, black and white, and color
Another challenge was that we shot everything in color – so how do you use that in a black and white scene? A really interesting challenge. If you look closely, every time the heroes are in the ascendant, when they’ve got the upper hand, they get more color. And when they hit the down beats, it gets more black and white.
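One standard way to get that dial between black and white and color from a color plate is to mix each pixel with its own luminance, driven by a per-shot ‘color amount’. A minimal per-pixel sketch using Rec. 709 luma weights; the parameter name and the specific grading math are my own illustration, not the film’s actual grade:

```python
LUMA = (0.2126, 0.7152, 0.0722)  # Rec. 709 luma weights

def grade(pixel_rgb, color_amount):
    """color_amount = 0.0 -> fully black and white,
       color_amount = 1.0 -> full color.
    Linearly mixes the pixel with its own luminance."""
    y = sum(w * c for w, c in zip(LUMA, pixel_rgb))
    return tuple(color_amount * c + (1 - color_amount) * y
                 for c in pixel_rgb)
```

Animating `color_amount` with the story beats gives exactly the upper-hand/down-beat oscillation described above.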
I got so fed up with having to recreate CG versions of the characters to do all the effects lighting over the years, which we’ve been doing since the nineties. Sometimes it’s great and sometimes it’s not. You’ve got to have perfect registration to make the effects lighting believable. This is the first film that I think has ever been made where you’ve been able to dial that in using the real light that was on the actor on the day.
We could cherry-pick from the light that we had in the plate and key that in. Light’s additive, so you’re just adding that light in and sculpting it. It’s an incredibly expressive tool.
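Because light is additive, the relit result can be expressed as a weighted sum of the per-light passes, with each weight turning one of the recorded lights ‘up’ or ‘down’ in post. A minimal sketch of that compositing step, with flat float lists standing in for images (the function is illustrative, not Method Studios’ actual pipeline):

```python
def relight(passes, weights):
    """Weighted additive sum of per-light plate passes.
    Each weight dials one recorded light up or down in post."""
    assert len(passes) == len(weights)
    out = [0.0] * len(passes[0])
    for frame, w in zip(passes, weights):
        for i, value in enumerate(frame):
            out[i] += w * value
    return out
```

Because only real captured light is being rebalanced, the registration problem that plagues CG relighting of live-action characters largely disappears.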
The time-multiplexed lighting enabled by Platelight, as discussed above, is an area of significant earlier research conducted by Paul Debevec and others at USC ICT. In particular, the technical paper, Performance Relighting and Reflectance Transformation with Time-Multiplexed Illumination, was presented at SIGGRAPH 2005 (authors: Andreas Wenger, Andrew Gardner, Chris Tchou, Jonas Unger, Tim Hawkins, and Paul Debevec).
In addition, continued research at USC ICT into time-multiplexed illumination was part of the group’s 2006 ‘Light Stage 6 Relighting Human Locomotion’ work, 2010 ‘Comprehensive Facial Performance Capture’ work (which also related to a test for the film Gravity), and further 2019 research known as ‘The Relightables’ volumetric capture system done at Google.