New approaches to virtual cinematography, the use of Unreal Engine, and understanding the film language of the filmmakers. An excerpt from issue #40 of befores & afters.
Going into Mufasa: The Lion King, MPC had, of course, previously delivered an almost entirely CG animated feature, and one that heavily involved virtual production methods, with Jon Favreau’s The Lion King (2019). On that film, Barry St. John had also served as production visual effects producer. Mufasa would tread some similar ground, approach-wise, while also offering up a whole new set of challenges.
“Mufasa was much different from The Lion King,” outlines St. John. “It was an original story the director was trying to find within the script and on set while we were making it. The Lion King was re-telling a story we already knew, so the story was rarely something in question, and that allowed us to be efficient in the storytelling aspects of the work created.
“The Lion King produced 125 minutes of virtual production content for a 107-minute film, while Mufasa produced 210 minutes of virtual production content for a 116-minute film,” adds St. John. “Mufasa forced the virtual production team to find a more efficient way to lay out scenes. We created a motion capture toolset that utilized a bipedal human controlling a quadruped lion in the game engine. This gave the director a real-time visual for blocking out their scenes without relying on a first pass of keyframe animation. This saved anywhere from three to six weeks per scene.”
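St. John doesn’t go into the mechanics of that biped-to-quadruped mapping, but the core idea, retargeting a performer’s joints onto a lion rig with per-joint offsets, can be sketched in a few lines. The joint names, offsets and data layout below are purely illustrative, not MPC’s toolset, which ran live inside the game engine.

```python
# Hypothetical sketch of a biped-to-quadruped retargeting step in the spirit of
# the toolset described above. Joint names and offsets are illustrative only.

# Map performer (biped) joints to the lion (quadruped) rig.
BIPED_TO_QUAD = {
    "hips":          "pelvis",
    "spine_02":      "spine_mid",
    "head":          "head",
    "left_shoulder": "front_left_scapula",
    "left_wrist":    "front_left_paw",
    "left_hip":      "back_left_hip",
    "left_ankle":    "back_left_paw",
    # ...mirrored right-side joints omitted for brevity
}

def retarget_pose(biped_pose, rest_offsets):
    """Convert one captured frame of biped joint rotations (euler degrees)
    into rotations for the quadruped rig, applying per-joint rest offsets so
    an upright human performance drives a horizontal lion spine."""
    quad_pose = {}
    for src, dst in BIPED_TO_QUAD.items():
        rx, ry, rz = biped_pose.get(src, (0.0, 0.0, 0.0))
        ox, oy, oz = rest_offsets.get(dst, (0.0, 0.0, 0.0))
        quad_pose[dst] = (rx + ox, ry + oy, rz + oz)
    return quad_pose
```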

But why virtual production? Why did the filmmakers and VFX teams lean so heavily into these techniques? St. John says, “Virtual production allows filmmakers to lay out a movie utilizing a toolset that a filmmaker would be familiar with. Laying out a film in this lo-fi, or low budget, manner allows the filmmakers and studio to understand what movie they are making. Often, large films rely on storyboards and traditional filmmaking tools to piece together their first draft of a film, which leaves a lot to the imagination of the audience. Virtual production tools allow the first draft of the movie to truly be a layout of the film they are making.”
Adam Valdez had been MPC’s visual effects supervisor on The Lion King. Going into Mufasa, Valdez took with him several lessons he had learned on the Jon Favreau film, and on other virtual production projects. It was important to the visual effects supervisor that the filmmakers could feel comfortable in tackling the storytelling of the film using the latest in VFX and virtual production tools, but with live-action sensibilities (plus, this planning phase of the production took place during the COVID-19 pandemic, when filmmakers and artists were essentially all working remotely).
“The main thing is making a technical process conducive to creativity,” observes Valdez. “In particular, making the steps of the process feel in some small way what it’s like to be in live-action workflows. This is because usually on these kinds of projects the whole filmmaking team is coming from live-action. I had also just completed shooting a bunch of episodes of Prehistoric Planet in my house during COVID. I had a small team I knew well, and an iPad, and that was about it. I knew from that experience that you don’t need a lot, just a tailored workflow that doesn’t get in your way.”

Could the visual effects team do this for a big movie with the big teams that usually surround major filmmakers? Yes, notes Valdez. Still, he says, “it’s always a really big process for live-action folks to make computer graphics for the first time. The limits, the steps, the delays, the rough look as you work—it’s not automatic for them, and it’s not the kind of material they are used to working with, which means their instincts don’t naturally kick in. They have to build new skills. That’s the hardest part of doing these movies. We were lucky in that our filmmakers were quick studies, dove in, and gave us the benefit of the doubt to create a process. They bought in, seeing that their choices made in that space would be taken and not messed with too much. So, all in all, this is what I’m thinking about the whole project long, and doing whatever I can to help them do their job, and have it all help me bring home the final results.”
In the hands of different filmmakers, Mufasa does have a different style than The Lion King. That meant Valdez and the visual effects team did need to adapt to the particular film language, camera placement and movement, and to the desires for the characters that director Barry Jenkins and his own team brought to the film. Jenkins and DOP James Laxton had done a few projects together, including The Underground Railroad, If Beale Street Could Talk and Moonlight. “They had their own short-hand and process,” observes Valdez. “A lot of that involves discovery and responding ‘on the day’ to what may be happening on set. Despite CG being infinitely plastic in a way, we all know that it’s still a lot of human labor that makes it happen, and so pivots are not fast. The animation process has built into it a revision system. You storyboard, you lay out, you block animation and change camera, you cut it together, and often change entire scenes once the whole is up on its feet. This isn’t cheap, but it’s part of what allows the animation medium to produce very distilled, detailed and tight results.”
Audrey Ferrara, who had been MPC sets supervisor on The Lion King, also noted the particular style that Jenkins and Laxton brought to Mufasa. “Their filmmaking approach is highly distinctive: very kinetic, with extremely close character shots and often very long takes. This required us to build fully 360-degree sets and ensure our characters could withstand close-up cameras and handle shots sometimes exceeding 2,000 frames. This approach left minimal room for errors, presenting not only technical rendering challenges but also requiring different scheduling approaches. We devoted considerable time to preparation for these complex shots—anticipating potential issues, conducting tests, learning from failures, and continuously improving throughout the process.”

What Valdez and Ferrara found was that Jenkins and Laxton were not looking for ‘animated’ results; they were looking to create moods and character-based moments. “Their camera style is to be a kind of floating observer, hardly ever still, and often drifting between characters and getting very close to faces. Sometimes cuts are long, with the camera moving between compositions that might normally be handled in cuts.”
“All of this meant,” concludes Valdez, “that we had to figure out how to allow for some discovery. How do we provide content that can be shot this way digitally? Can we ultimately deliver the intimate character portrayals these guys do? The point is to capture their choices and not get in their way, and then of course the CG teams have to deliver on all of this. They had to actually fulfil those moments as if the final graphics were what the camera was responding to on set, as if the final performances were the takes the editor and director made in take selection. It’s all a massive game of retrofitting final work to an initial choice made as a bit of a leap of faith.”
Based on how Jenkins and Laxton were approaching the film, then, the visual effects team sought to implement a number of targeted virtual production and VFX solutions. These were realized completely inside MPC and related principally to scouting in VR, shooting with virtual cinematography tools, motion capture, and more closely intertwining Unreal Engine with the VFX pipeline. Unreal Engine also became the place where all of Mufasa’s sets were first built, and where Laxton lit them prior to shooting.

“The miracle of Unreal Engine is a confluence of different features that combine to enable creative decisions,” starts Valdez, on how Epic Games’ game engine allowed for a couple of these virtual production workflows. “You have the real-time graphic potential and viewport. You have VR enabled. You have a non-destructive paradigm built in where the Sequencer allows you to bring in various ingredients, modify them and tweak them to create the concept of takes. And then they have a whole team working on what it means to do visual storytelling in a tool originally built to make software that most of us call games. In other words, because it can do games, it can also do other kinds of interactive things. That’s its fundamental function, in my book.”
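MPC’s take management ran through its own tooling (more on Genesis below), but as a rough illustration of how Sequencer can be driven programmatically to stand up a ‘take’, here is a minimal sketch using Unreal’s stock Python scripting API from inside the editor. The asset names, paths and frame rate are hypothetical.

```python
# Minimal sketch: creating a Level Sequence 'take' with Unreal's stock Python
# scripting API (run inside the Unreal Editor). Paths and names are hypothetical;
# MPC's actual take management ran through its own Genesis platform.
import unreal

def create_take(take_name, package_path="/Game/Mufasa/Takes"):
    tools = unreal.AssetToolsHelpers.get_asset_tools()
    sequence = tools.create_asset(
        asset_name=take_name,
        package_path=package_path,
        asset_class=unreal.LevelSequence,
        factory=unreal.LevelSequenceFactoryNew(),
    )
    sequence.set_display_rate(unreal.FrameRate(24, 1))  # 24fps film rate
    # Spawn a cine camera into the take so an operator can frame coverage.
    camera_binding = sequence.add_spawnable_from_class(unreal.CineCameraActor)
    return sequence, camera_binding
```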
MPC looked to further build creative and team routines on top of the interactive capabilities in Unreal Engine for Mufasa. “We asked, ‘What do we need to build ourselves to track changes and build workflows for us?’” recounts Valdez, who states that the guide for this was simply the filmmaking process. “First, we build big sets and decide where to work within them. So that’s scouting in VR. Stereo vision allows parts of your spatial brain to kick in. You roam around and start mapping and understanding the space. You try lenses with ‘viewfinders’ and see where the good compositions are, and you decide where you stage the action, or tweak the set so it works best for the action and lens.”
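The ‘viewfinder’ side of that scouting comes down to standard lens math: for a given focal length and filmback, the field of view seen through the virtual lens is fixed. A quick sketch of that relationship follows; the formula is standard optics and the Super 35-style filmback width is an assumption, not a detail of MPC’s tool.

```python
# Horizontal field of view for a spherical lens, from focal length and
# sensor (filmback) width: fov = 2 * atan(sensor_width / (2 * focal_length)).
import math

def horizontal_fov_degrees(focal_length_mm, sensor_width_mm=24.89):
    """Default width approximates a Super 35 film frame."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

for focal in (21, 35, 50, 85):
    print(f"{focal}mm lens -> {horizontal_fov_degrees(focal):.1f} degrees horizontal FOV")
```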
The VR scouting process for Mufasa was, as St. John describes, somewhat like playing a game together—all from the comfort of COVID lockdowns. “We initially had everyone in the creative realm participating in a live game in Unreal Engine as we created the sets and laid out the world and journey we were building. We sent high-end gaming computers to everyone’s house across the globe, then we had a team of operators controlling the boxes remotely to ensure no contact happened between real humans.”

“From there,” adds St. John, “each user put their VR goggles on and could fly around the world we were creating and scout with each other in real time. Each user looked like a floating egg head in the game, but each user could pull up a camera, draw on a set in 3D, and measure things within the set to understand size and scale. While it took a moment to get used to, it is something we feel extraordinarily proud of. All sets were built in 3D and broken up to allow people to move objects, chess pieces and tools around in real time while others witnessed the creation of a collaborative effort. It was truly remarkable and safe for that time.”
In terms of what to film, MPC’s approach was to craft ‘master scenes’ of action. These could start with pure keyframed animation, or with motion capture, which was more directable. “I’d say we did a 50/50 ratio of those approaches,” says Valdez. “Either way, you end up with a scene, or a beat of a scene, that can be covered by cameras. This is hugely laborious if the work has to have all the tiny details that an operator can respond to, so that the cameras have real value long-term. But for a film like this, the animation team just slogs it out and works their ass off, responding to changes over and over until the scene seems essential dramatically.”
“The actual camera moments,” continues Valdez, “that is, taking those scenes and putting cameras to them, become a smaller job, as the DP, who did the operating here, would go in and spend his time on each scene creating coverage in their style. Sometimes, even after all this, it became evident in the edit that some new animation piece might be required to fulfil a certain cut they wanted, or a change to the scene that needed some glue.”

The film, as noted, has such an intended live-action feel. Rather than feel limited by the technology, Valdez says the filmmakers embraced a moving camera, partly because this was their style anyway. In fact, points out Valdez, “I think a lot of a live-action feel lives and dies in the nature of the camera’s movement. Here, we really tried not to smooth it or take any of the hand-held out of it. And it’s almost always on the move. Here, also, lighting marries to camera. These guys don’t move lights per angle all that much. They kind of like a naturalistic, perhaps ‘verite’ style. So the hand-held camera, the sense that an operator is responding to life in front of the lens, and that this life is happening in a magic moment; that’s a lot of what especially character-emotion moments were all about.”
“Now, of course,” further notes Valdez, “those scenes would have been in production over a year between stage capture and final animation. So, to preserve that raw state and embellish it is just a huge effort from all the layers of supervision, the filmmakers and editorial process. It’s not a very technical endeavour, it’s just artistic intent and hard work and a relentless review process trying to make sure we keep this quality in the final frames.”
Throughout the whole virtual production ‘shoot’, thousands of on-stage takes were shot using MPC’s virtual camera, or VCam, and the motion capture systems. Each take produced both a recorded and a rendered artifact. MPC sought to track per-take data for the entire show. A further step, as part of this push to be as interactive as possible, was to allow animation to be rendered back into Unreal Engine as the shots continued to be refined (final shots were rendered in RenderMan). This Unreal Engine workflow was handled via an extension of MPC’s virtual production platform Genesis.
“Data tracking is a very dry thing, not super sexy from the outside, but it enables so much,” advises Valdez. “We do something like 30,000 takes on a show like this. And we never know which will be selected, or which might be loaded back into the shooting stage to go again, potentially weeks later. So, being able to track the movement of objects, the changes to animation timelines, the positions of lights, or the change of a sky for a new time of day is vital.”
“And then later once the edit locks in and those takes must travel to the final stage, the take data must match,” notes Valdez. “In this case, we went one step further. Once animation became near final, we’d bring the animation and potentially tweaked camera back into Unreal Engine so that we could make dailies that would cut back into the main cut that might still have a majority of stage takes.”
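The article doesn’t describe Genesis’ data model, but the kind of per-take record Valdez is describing, one that can survive being reloaded on stage weeks later and then matched against the locked edit, might look something like the sketch below. All field names and values are hypothetical.

```python
# Hypothetical per-take metadata record; MPC's actual tracking lived inside
# its Genesis platform, so the fields here are illustrative only.
from dataclasses import dataclass, field

@dataclass
class TakeRecord:
    take_id: str                       # e.g. "sc042_tk0173"
    scene: str
    capture_date: str
    camera_file: str                   # path to the recorded VCam move
    mocap_files: list = field(default_factory=list)
    set_overrides: dict = field(default_factory=dict)  # moved props, dressing tweaks
    light_rig: str = "default"         # which lighting setup was active
    sky_time_of_day: str = "noon"
    status: str = "shot"               # shot -> selected -> reshot -> final

    def matches_edit(self, editorial_take_id: str) -> bool:
        """True if this record corresponds to the take cut into editorial,
        so the same data can travel on to the final VFX stage."""
        return self.take_id == editorial_take_id
```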
That process of bringing animation, done in Maya, back into Unreal Engine was dubbed SWFT, and was headed by MPC senior architect Callum James. “Our newly developed export tool streamlined the delivery of selected takes directly to MPC departments while formatting content specifically for our pipeline, ensuring complete preservation of creative decisions throughout the turnover process,” outlines Ferrara. “We also made the strategic choice to present primary animation from Unreal Engine, maintaining visual consistency with editorial’s cut.”

“Our workflow emphasized capturing comprehensive detail with immediate pipeline integration,” continues Ferrara. “Takes were instantly rendered, reviewed in dailies, and automatically distributed using Movie Render Queue, FBX and USD workflows. This infrastructure provided immediate access to all cameras and performances—offering either reference material or, for camera work, the exact setups required for downstream production stages. We further extended these tools and platforms into post-production, allowing animators to utilize the precise cameras and environments from the original shoot during their iteration and review cycles. This continuity between production phases preserved both creative vision and technical accuracy throughout the project. Unreal Engine ultimately emerged as an indispensable cornerstone in the film’s production methodology.”
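Ferrara doesn’t detail SWFT’s internals, but the USD leg of that turnover, handing downstream departments the exact shot camera, can be illustrated with Pixar’s stock USD Python API. The paths, lens values and stand-in camera move below are placeholders, not MPC’s pipeline.

```python
# Rough illustration of delivering a shot camera downstream as USD, using the
# open-source pxr Python API. All values and paths are placeholders.
from pxr import Usd, UsdGeom, Gf

def export_take_camera(usd_path, focal_length_mm=35.0, start=1001, end=1120):
    # New USD stage spanning the shot's frame range.
    stage = Usd.Stage.CreateNew(usd_path)
    stage.SetStartTimeCode(start)
    stage.SetEndTimeCode(end)

    # Define the shot camera and its (static) lens package.
    cam = UsdGeom.Camera.Define(stage, "/shot/camera_main")
    cam.CreateFocalLengthAttr(focal_length_mm)
    cam.CreateHorizontalApertureAttr(24.89)  # Super 35-ish filmback width

    # A stand-in camera move: a slow dolly along X, sampled per frame.
    translate_op = cam.AddTranslateOp()
    for frame in range(start, end + 1):
        translate_op.Set(Gf.Vec3d(0.01 * (frame - start), 1.6, 5.0), frame)

    stage.GetRootLayer().Save()

export_take_camera("sc042_tk0173_cam.usda")
```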
Valdez believes the benefit of this approach was the ability for all to see where a particular shot or scene was up to. “It’s always been a thing that editors don’t like to cut rough work into their main cut, because it bumps the watch. Also, it may just be hard to live with, to share, to tweak cuts to. So here we wanted to make sure the cut was always fully up to date for editor, director, any studio discussions—anything. It means the director is living day in and day out with the real work, not old stuff. The sooner you can bump old/temp stuff out of the cut the better. Because ‘temp love’ sets in and everyone just loves the old.”
“I’ll confess, though,” admits Valdez, “that sometimes old temps are the best version of a shot. We can break shots in the process, that is, lose what made them work as the many hands of post get involved. So this is all a balancing act like everything else.”

As an extension of the virtual production R&D being done by MPC, a three-minute trial sequence was concocted for Mufasa that would be an early test for the animation of the main characters. “The three-minute trial sequence was used as a way to familiarize Barry and James with virtual production and also for the stage and animation teams to iron out workflows,” explains Daniel Fotheringham. “The animation department needed to deliver a three-minute animated performance consisting of our four main characters and an additional eight pride lions. This piece was animated in Maya then ingested into Unreal Engine where James could virtually explore the scene from any angle.”
“Creatively,” continues Fotheringham, “the animation team started relatively traditionally, following a storyboard animatic provided by editorial. Then, over a series of presentations on Zoom with Barry, we developed the piece until it was ready to be ingested into Unreal Engine. These presentations consisted of showing Barry multiple viewing angles of the master scene we were creating: usually hero cameras rendered out of Maya that followed the main characters, plus additional wide witness cameras of the scene. This process was eye-opening for all of us and revealed Barry’s very strong preference and skill for intricate stage choreography. Based on this we needed to develop a much more organic and flexible approach, which animation isn’t well known for. The test spurred dozens of tools and workflow advancements, including QuadCap, which we would continue to develop over the rest of the production.”
Out of the virtual production process, MPC was able to make what Ferrara refers to as ‘premium previs.’ “We dedicated extensive time to previsualization, ensuring it was highly detailed and captured the filmmakers’ entire creative process. The animation was elevated to near-blocking stage quality, while lighting was meticulously curated with specific choices for skies, time of day, and weather conditions.
“Within Unreal Engine, a total of 12,680 on-stage takes were shot using the VCam and motion capture systems,” adds Ferrara, “all of which produced both a recorded and a rendered artifact. Sets were crafted with precision, featuring highly detailed models and textures. This approach provided not only a more visually appealing experience for Barry Jenkins and Joi McMillon, the editor, but also a more accurate initial version of the movie. This comprehensive previsualization allowed the studio to visualize the film’s direction more quickly than usual. The previs ultimately became an even more essential reference point—a true bible for the production—to which we consistently returned throughout the filmmaking process.”
Importantly, too, the decisions made in the ‘virtual’ stage of production were preserved and referenced right throughout the whole process, including when MPC began tackling lighting. “In order to guarantee lighting continuity between previs and post-production, we asked James Laxton to directly brief our lighting supervisor, Francois de Villiers,” outlines Ferrara. “This direct line of communication gave Francois insight into James’ process, how he set up each scene, and what type of problems he might have faced during the shoot. Francois could also directly ask any questions he had and get a better understanding of James’ taste and choices. This also helped Francois organize the lighting workflow in a way better suited to the show.”







