VFX supervisor Robert Legato breaks down how the iconic scene was filmed, virtually.
Jon Favreau’s re-imagining of The Lion King took on a whole new filmmaking paradigm. The final result was a huge visual effects and animation effort, and it also involved a heavy reliance on virtual production – a virtual stage with real-time and VR tools – all aimed at bringing to the film a ‘live-action’ look and feel, as if a camera crew had gone out to shoot the movie in Africa.
To highlight that process, befores & afters spoke with visual effects supervisor Robert Legato about the process used to ‘shoot’ the ‘Everything the light touches’ scene in the film – that moment when Mufasa takes Simba up to the top of Pride Rock for a look upon the kingdom. This includes shots of them from the front and behind, as well as a circular move around them.
Here’s how Legato worked with cinematographer Caleb Deschanel, virtual production studio Magnopus (responsible for implementing software and hardware solutions on the virtual stage), visual effects studio MPC (involved in the entire virtual production and VFX process), and the many, many artists on The Lion King, to realize that scene. This was done on a virtual stage – a volume in Playa Vista, California filled with real-world film equipment ‘bolted’ into a virtual production system.
What the ‘Everything the light touches’ shot required
Robert Legato (visual effects supervisor, The Lion King): The concept of the scene itself is that you’re going to explore a 360 degree view of the landscape, and the landscape is in essence the star of the shot, because that’s the point of the scene – everything that the light touches.
It’s an incredibly simple staging, compared to a lot of the film. In that particular scene they climb up and they’re roughly in the same spot for almost the entire moment, so everything that happens around them is camera, and lighting, and storytelling in terms of the vastness of what you see. So in terms of what you wanted – it’s also an intimate moment between them – for camera moves, I didn’t want it to look like a kind of ‘Tony Scott’ circular helicopter shot that never ends.
At the very beginning, Jon and Caleb and I conferred about what it is that we were trying to achieve, and it was pretty early on in our process and Caleb was still getting his feet wet in terms of, well, how do I do a Technocrane shot, and how do I set up some of the physical mechanics of attaching a circular dolly to a straight track, and how does that compute?
Getting your head around virtual cinematography
It’s not that obvious unless you are familiar with this sort of work. You know, a circular track goes in a circle, but a straight track – how does that go in a circle? Well, it’s actually that we’re using it to create a spline, and we are now pushing the dolly on the spline, so we still have the hand-operated look. Then we have to build a certain apparatus in the computer that’s just like a circular track, and it’s always going to aim at the middle of the shot. In our particular case, unless you have this constraint built into it, it doesn’t quite act like a circular track with a camera on it, because if you set up a dolly on a real circular track and you’re aiming at the centre, you will always aim at the centre. In the virtual case, it doesn’t.
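The constraint Legato describes – a dolly pushed along a spline while the camera is forced to keep aiming at the middle of the shot – can be sketched in a few lines. This is a hypothetical illustration, not the production’s actual Unity-based setup; the function names, the circular spline, and the 2D ground-plane simplification are all assumptions:

```python
import math

def dolly_positions(center, radius, start_deg, end_deg, steps):
    """Sample a circular 'track' (a spline in the virtual world)
    around a centre point, in the ground (x, z) plane."""
    cx, cz = center
    for i in range(steps):
        t = math.radians(start_deg + (end_deg - start_deg) * i / (steps - 1))
        yield (cx + radius * math.cos(t), cz + radius * math.sin(t))

def look_at_yaw(cam_pos, target):
    """The 'always aim at the middle of the shot' constraint:
    return the yaw (heading) angle that points the camera at the target."""
    dx = target[0] - cam_pos[0]
    dz = target[1] - cam_pos[1]
    return math.degrees(math.atan2(dx, dz))

# A quarter-circle move around a subject at the origin, 5 m out:
# each frame is (camera position, camera yaw).
frames = [(pos, look_at_yaw(pos, (0.0, 0.0)))
          for pos in dolly_positions((0.0, 0.0), 5.0, 0.0, 90.0, 5)]
```

Without the `look_at_yaw` constraint, the camera pushed along the spline would keep whatever orientation the operator last gave it – which is exactly why, as Legato notes, a virtual straight track does not behave like a real circular track until the aim-at-centre apparatus is built in.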
So at first, Caleb was working away and got a little confused, because it doesn’t behave like real life, but we have to make it behave like real life. So every shot became like, well, how do I get our gear to imitate that exactly, so it feels exactly like the circular dolly track without actually having to build one? And the net result is the same. So we all conferred about that, and then it became like, okay, here’s the jumping off point.
Making the virtual production tools
Magnopus [under virtual production supervisor Ben Grossmann] led the development of this on the front-end, with MPC [under visual effects supervisor Adam Valdez] having pre-built environments and done stand-in animation, which was fed into Unity. On the stage, we wanted to do a real dolly, wanted to do real crane, wanted to do Steadicam. So we had the OptiTrack sensors there on the virtual stage in the ceiling that gave us this volume to enable live-tracking.
If we wanted to do hand-held, we’d be running with the camera kind of Pogo-cam-type stuff. So we had a laundry list of things we wanted to do, and then you have essentially the toolkit to do whatever you want with it. We’d make up some physical items to create the proper weight, so, for example, we’d create a Pogo-cam that had a weighted bottom so it doesn’t move quite the same way as just a pure handheld.
So, you basically can create a Steadicam of sorts with the various equipment that we could glue together. Also, we found that part and parcel of what makes the Steadicam shots feel like a real Steadicam shot is having a real Steadicam operator film the shots. There is a particular flavour to how you can move with this particular device.
The actual devices themselves are the most appropriate way of figuring out what the shot is, just by their mechanical nature. I mean, you really can’t improve on how we photographed movies for a hundred years. We would, of course, plot camera moves by keyframing in the computer, but often every time we did we’d spend all this time trying to get it right and then, I’d go, ‘Screw it, I’m just going to take it back to stage and I’m going to shoot it in five minutes, because it has all the stuff that automatically makes the shot work, because I’m able to view it, judge it, shoot it – generally not in one take, but maybe four or five.’
And that subtlety, that ‘go a beat faster here, go a beat slower here,’ all of that is music – literally you’re composing a feeling that your brain says, ‘Okay, I got it,’ when you see the appropriate take, and you print that take, and it goes in the movie. And it’s so different than plotting the CG version of it, which has no life.
The other point I’ll make about the live camera operator behind the camera – which I discovered more on this film than I ever had, and which intellectually I had not really thought of this way – is that they are the audience. When you view what they’re looking at through that eyepiece, they’re the audience, too, and we’re slightly adjusting the frame, and we’re slightly panning and tilting, because our attention span is making us want to do it.
The ‘on-set’ experience
We’re all there together at the beginning. At the point of working on the ‘Everything the light touches’ scene, we had shot some other footage, like the stampede, that needed some work, so Jon was in the editing room, and he had a live video tap to our stage, so he was always seeing what we were doing anyway.
Where the real camera would be
We didn’t really want to be at an impossible camera angle, so we kept it in the realm of, well, if we were shooting it and could shoot it any way we liked, we would probably have the Technocrane on one portion of the mountain and be able to do a portion of the circular move, then re-mount the crane and the camera to another fulcrum point and do the other portion, remount it again, and do another portion.
What they ended up shooting
We did shoot that kind of Technocrane shot, but then we went back and re-shot it again, and I did what felt like a helicopter shot where we were not banking so much, because we didn’t want to see that. So we thought, if you had a Spacecam and you were doing a circular pattern around it, we would do it like that.
We made a big giant wide circle that we could hand operate, and then Caleb was free to pan and tilt, and, sometimes, when we went a little too far, we’d have to correct it, and all those little corrections suggest that it was naturally photographed, or photographed in real life and not a computer generated camera move. Plus, it was a softer moment and not an action shot type thing, and then we’d do things like, how much do we pick up the focus, when do you rack focus, and things like that.
Pre-building the environment
To enable us to do this with the virtual camera and to see it in VR, there was a massive pre-build of the environment. That was led by production designer James Chinlund and MPC. And a lot of that was based on a shoot we did in Africa, and then designing the sets and these vistas. We’d then ‘location scout it’, and make adjustments. We’d say, ‘We should put the water here, put some trees over here, move that stuff.’
Re-viewing and re-visiting shots
Caleb and I, while Jon was in the editing room, would shoot set-up after set-up with different lenses and different focal lengths, and then the edit kind of tells you what the scene should be as we’re trying to discover it. Like, for this circular move with a helicopter, I would add the crane to the camera that’s circling around so I could be rising, moving up a foot or two, or moving down a foot or two – all that, so it felt much more helicopter-oriented, where it was kind of slightly changing altitude, say.
And when you watched it back, you’d notice either that it was perfect, or that you went too far and had to tuck it back and not be so aggressive going up and down, or that it wasn’t obvious at all and you needed to exaggerate it in various spots. So, we were feeling our way through the sequence, and we shot it over probably a couple of days, and then I went back in and we made it more of a helicopter platform than a Technocrane platform like we had before.
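The ‘helicopter platform’ idea Legato describes – a circular move with a slow, gentle change of altitude layered on top – can be sketched as follows. This is a hypothetical illustration; the function, parameters, and sine-based drift are assumptions, not the production’s rig:

```python
import math

def helicopter_move(radius, height, steps, bob_amplitude=0.6, bob_cycles=1.5):
    """Circular camera path with a slow vertical drift layered on top,
    mimicking a helicopter that gently changes altitude (a stand-in for
    the hand-operated crane offsets described in the interview)."""
    frames = []
    for i in range(steps):
        t = 2 * math.pi * i / steps                            # position around the circle
        y = height + bob_amplitude * math.sin(bob_cycles * t)  # slow rise and fall
        frames.append((radius * math.cos(t), y, radius * math.sin(t)))
    return frames

# A full circle at 30 m radius, nominally 8 m up, drifting ±0.6 m in altitude.
path = helicopter_move(radius=30.0, height=8.0, steps=240)
```

The point of keeping the drift slow and slight mirrors the interview: too much and it reads as aggressive, too little and the helicopter feel is not obvious at all.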
Scouting in VR
Viewing all this in the VR goggles was almost kind of like being in a helicopter. There’s kind of a visceral sensation as you fly around, and things tend to look pretty cool through the prism of a helicopter as it circles around. At one point we were considering really free-forming it, by one of us flying around with the camera attached and Caleb operating it, because we did get some natural sort of a flight pattern. The thing sort of reminded you of what a real helicopter shot is, because you could be in goggles, you have a visceral quality of what the ground looks like, how you’re circling, how the background is moving, and so it does give you something that is a lot more reminiscent of what it would’ve been like if you filmed it for real than just looking through a portal in a virtual camera.
Then, being in VR also helped with location scouting and being able to point things out as we’re looking for where the shot is. We could fly down and say, you know what, at this angle, that river is just too narrow for us to really appreciate it, so let’s exaggerate it. So we’re able to, as if it’s a miniature almost, look at it from the appropriate vantage point, feel what it feels like to move around in a helicopter, and also fly down and point out specifics, like from this angle, when you look at this, it foreshortens so much that it doesn’t give us the graphic quality that we want, so let’s move that.
Even when we shot the closeups, I went back in and added the hand-held move to the close-ups a little bit to make it less perfect. We’re always trying to make it not perfect, but not bad. There’s a really fine line between that and looking like just generic hand-held moves.
The right sun
It took a while in shooting, and quite frankly it took a while in post, too, to get the way the sun really wants to graze the ground, and to make it look right. Jon’s kind of more averse to beautiful sunrises and sunsets than Caleb and I are, so it was a delicate balance of getting just the right thing that pleased Jon and pleased us at the same time, and served the story.
The right sky
Quite frankly, the sky means everything in these shots, and in Jon’s world, he really doesn’t like overly dramatic skies. Yet it still has to be dramatic enough to warrant why you would frame it this way, and it occupies a lot of the screen area, so it has to be something – and it has to be something that suggests a time of day.
So we auditioned a bunch of skies and were able to, in VR, just spin them around until we found the sky that we liked, and we then imitated that when we did the shot. Then, we would change the sky even when we did reverses or other angles, much like you would if you were shooting it. In a real shoot, you might wait for a different time of day or shoot part of the scene at sunset compared to sunrise, because that might give you a little bit more of a glow where a glow would not have been if you were to do a dead reverse on sunrise. All of it still has to look like we didn’t cheat that much, i.e., that it looks like a normal film version of a cheat and not a ‘the computer can do anything you want’ cheat.
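Spinning a sky around until it sits right behind the frame amounts to rotating a panoramic (equirectangular) sky image by a yaw angle, which just shifts its pixel columns around. A minimal sketch, assuming a list stands in for the image columns (hypothetical code, not the production’s tooling):

```python
def rotate_sky(columns, yaw_deg):
    """Rotate an equirectangular sky by a yaw angle by shifting its
    pixel columns; 'columns' is any sequence standing in for the
    panorama's vertical slices, covering 360 degrees end to end."""
    n = len(columns)
    shift = round(n * (yaw_deg % 360) / 360)  # columns to wrap around
    return columns[shift:] + columns[:shift]
```

Because the panorama wraps a full 360 degrees, rotating by any multiple of 360 returns the original image, and negative angles simply spin the sky the other way.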
Rob Legato will be speaking at the VIEW Conference in Turin, which runs from 21 to 25 October.
This week at befores & afters is #realtimewrapup week, with reports on real-time tech and virtual production, direct from SIGGRAPH 2019. There’s also more Lion King coverage coming, specifically on MPC’s environments work and focusing on the character Pumbaa.