From early animation to virtual cinematography to final shots
With the virtual production approach utilized in making Jon Favreau’s The Lion King, one thing I’ve been interested in asking the filmmakers is how those first steps of production on the virtual stage took place, i.e. how early animation was done to enable the virtual cinematography, and then how things kept iterating, all the way through to MPC’s final visual effects work.
I got an opportunity to discuss the workflow more with production visual effects supervisor Rob Legato, animation supervisor Andy Jones and MPC visual effects supervisor Elliot Newman at a special Disney VFX day this week.
We talked about the process of early environment and animatics construction, using the virtual filmmaking tools developed by Magnopus, working with DP Caleb Deschanel, and then MPC’s completely CG shots. The whole intention, of course, was to treat the production as much like a live-action film as possible, even though the final result is essentially completely animated.
b&a: Andy, what were you animating in the very early stages?
Andy Jones (animation supervisor): The first stage was a storyboard animatic by Dave Lowery, our head of story, who would work with Jon Favreau and create these kinds of beats with storyboards, kind of like a traditional animated film. Then, once Jon was happy with the timing and the feeling of that, it would be given to us. We would go in and animate the entire thing, not board by board, but as one long beat, so that it could be shot from any angle.

Caleb Deschanel and Rob Legato. Photo by: Michael Legato.
Once Jon was happy with that, it would go to the Vcam stage and they would shoot it. There might be some moments that weren’t working, or that we needed to explore more. So we’d go in and animate just a specific part of it, reshoot it and do a re-edit. Then it might get tweaked some more. Then it would go to MPC as a finished cut scene.
Those animators in the early stage were all hired by MPC. It was an in-house MPC team in LA, working with me there. Then, once the entire film was more or less shot, I went to London and we’d do the final animation.
b&a: Elliot, what was it MPC ‘received’ once that virtual production process had taken place?
Elliot Newman (visual effects supervisor, MPC): The cool thing about this production was that, because we were involved in the shoots, we were able to develop a pipeline that allowed us to ingest all of the data from the shoot. So you’d have the character rigs. You’d have the actual camera moves. You’d have the lighting information. All of that was 3D data assets that we’d ingest. Then we wrote some tools to help translate that into production scenes, which would have the high-resolution character rigs. Your ‘version one’ layout would effectively be that: the 3D scene that they shot with. Then the cool thing with this medium is that it allowed you, if you needed to, to go back to the stage as well. So you could go backwards if you needed.
If we were working on a scene and they wanted to change the camera angle, because it wasn’t working or for whatever reason, you could actually convert that back into the shoot scene and then shoot it again. Then you’d work that same process again, and adjust it back, because you weren’t locked into a plate that was shot.
b&a: Can you give an example of that, Rob, where a scene came back for virtual production?
Rob Legato (production visual effects supervisor): Well, I mean, if you’re directing a film, you’re always directing the film. You shoot it, you edit it, you do new ADR lines, you’re always directing. Even in the DI, you’re still altering stuff. Now, what we started with was animation that was not beat for beat – all the nuances of it – it was sort of the rough caricature of it first, just because there was so much to animate.

Caleb Deschanel
Then when we would shoot it, it would start to become more specific about what needed to happen. It would change, say, because the characters would move a little faster or a little slower. Our camera move on stage would not necessarily translate, because what we were photographing got altered.
So a lot of times, once the animation was finalized, we would go back and just touch up the camera a little bit, to give it the same off-handed flavor, but now be specific to that particular beat. Some things were fixed just on the box. But for a lot of things, to get the same quality of that off-handed nature of shooting, we would go back and touch up the scene. It had that sensibility of iterating, just like you can do on a live-action film.