How exactly does Wētā FX take on set performance capture and translate it into living, breathing apes?

May 12, 2024

Find out in this in-depth VFX look behind the scenes of ‘Kingdom of the Planet of the Apes’, which also covers Wētā FX’s advancements in capture tools, facial tools, crafting that incredible river moment, Raka the orangutan, a new feather tool for the eagles, and the art of making apes ride horses (yes, it’s really, really hard).

Wes Ball’s Kingdom of the Planet of the Apes once again relies on the fine art of Wētā FX to translate actors in performance capture suits and head-mounted cameras into 100% CG creatures.

The visual effects studio has, of course, a mountain of experience from the previous Apes films, and the Avatar series.

On Kingdom, Wētā FX faced new production challenges, such as swapping British Columbia shooting locations for the harsh sunlight of Australia, crafting apes with far more dialogue this time around, and delivering several sequences that combined complex ape animation with equally complex water simulations.

With visual effects supervisor Erik Winquist and animation supervisor Paul Story, befores & afters goes deep on the principal challenges of Kingdom, breaking down how Wētā FX delivered more than 1,500 visual effects shots for the film (only 38 shots in the entire movie are without VFX, with a stunning 33 minutes being entirely digital).

From performance capture to final performance

Since the first Apes film, Rise of the Planet of the Apes, Wētā FX has used a system in which the actors wear performance capture suits fitted with active markers, that is, markers that use powered LEDs to emit infrared light, along with an array of wireless motion capture cameras. The combination of the two made large-scale outdoor capture possible in fairly rugged conditions.

“For Kingdom,” outlines Erik Winquist, “we were at our third generation of active suits, where the strands on the suits, those siliconized rubber housings for the actual markers, are now embedded within the suits. That means there isn’t anything to snag on branches or limbs of other performers.”

In terms of facial capture, the past Apes films had typically relied upon a single helmet-mounted capture camera looking back at the actor’s face. “Now we’re using a pair of stacked stereo cameras that allow us to then reconstruct a 3D depth mesh at 48 frames a second,” says Winquist. “That helps inform us of all the really subtle nuances of the little lip movements or anything that an actor may have done with their face that just then flows beautifully into our Deep Learning Facial Solver.”
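As a rough sketch of what a stacked stereo pair buys you (the numbers and function here are illustrative, not Wētā FX’s actual tools): once the two helmet views are rectified, per-pixel disparity converts directly to depth via the standard pinhole relation, producing one depth map per captured frame to triangulate into a mesh.

```python
import numpy as np

def disparity_to_depth(disparity_px, focal_px, baseline_mm):
    """Standard rectified-stereo relation: depth = f * B / d.

    disparity_px: (H, W) pixel disparities between the two helmet cameras.
    focal_px:     focal length expressed in pixels.
    baseline_mm:  distance between the two camera centers.
    """
    d = np.where(disparity_px > 0, disparity_px, np.nan)  # mask failed matches
    return focal_px * baseline_mm / d                     # depth in mm

# At 48 fps, this yields one depth map per frame of the performance:
depth = disparity_to_depth(np.full((480, 640), 32.0), focal_px=1100.0, baseline_mm=60.0)
```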

The deep learning solver (originally pioneered for Gemini Man) was used to go directly from capture to a simplified digital model of the actor’s face. “We used to capture to our digital model to give us something that we could QC against the original performance,” discusses Paul Story. “So, there we’d have something that we would capture against, QC that, maybe tweak that with motion to work out if those keys and muscles were working properly, and then transfer that over to the character puppet. But that previous way was quite a long process, so what we had developed on this film was a way to set up the solver with the digital model, but when the animator got to it, they could really capture or solve straight to the character puppet. That just gave them more time to be able to really fine tune the performances, looking at the original performance and making sure the lip sync was working, the eye shapes were working and the emotions were reading just as it was intended.”
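A hypothetical sketch of that shortcut (dimensions and the linear mapping are stand-ins, not the studio’s actual solver): instead of solving capture onto the actor’s face model and retargeting to the ape in a second pass, the actor-to-puppet mapping is pre-composed so the solver output lands directly on puppet controls.

```python
import numpy as np

rng = np.random.default_rng(0)
actor_weights = rng.random(120)           # stand-in solver output: actor face-shape weights
actor_to_puppet = rng.random((80, 120))   # stand-in for an authored/learned mapping

def solve_to_puppet(weights, mapping):
    # One composed step replaces the old intermediate QC-model stage;
    # animators then fine-tune the resulting puppet controls by hand.
    return np.clip(mapping @ weights, 0.0, 1.0)

puppet_controls = solve_to_puppet(actor_weights, actor_to_puppet)
```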

An additional part of the capture process involved earlier preparatory work with the actors in terms of scanning and recording face shapes, as well as the use of witness cameras on set and, obviously, relying on what is captured with the principal film cameras. Some additional performance capture also took place on a controlled motion capture stage.

To take the performance capture through to final animation, body capture and facial capture were tracked and solved via various toolsets at the studio. As Story explains, a motion editing team was responsible for mapping capture to the character, with each character having a specific ‘map’. “The motion editors prep that motion to just get the base performance, making sure that it’s all tying in with time code and making sure that’s all lining up with the facial capture we’re seeing.”
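A minimal sketch of the timecode lining-up Story describes (format and frame rate assumed): convert each stream’s SMPTE timecode to absolute frame numbers at the shoot rate, then offset one clip so body and facial capture start on the same frame.

```python
def tc_to_frames(tc: str, fps: int = 24) -> int:
    """Convert an HH:MM:SS:FF timecode string to an absolute frame count."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

body_start = tc_to_frames("14:03:22:10")
face_start = tc_to_frames("14:03:22:16")
offset = face_start - body_start  # shift the facial clip by 6 frames to line up
```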

One aspect the motion editors were particularly responsible for was an initial re-targeting of limb sizes. That’s because a very deliberate decision was made on Kingdom to create taller apes with longer legs. Partly, this was driven by a taller set of actors. “A lot of our actors were six feet or above,” observes Winquist. “Of course, however they framed the actors, where their eyes were in frame, that’s where their face should be in frame. But if we had a six-foot-tall actor, like Owen Teague for Noa, we had to make Noa have subtly longer legs, otherwise we were going to have a bunch of floating characters in order to fit the composition.”

“The justification I was using for myself was, well, we’re on an evolutionary trajectory from those chimps that we saw in the jungle at the very beginning of Rise of the Planet of the Apes that were normal-sized chimps to the 1968 Charlton Heston movie where we’ve got these very upright actors in suits. So why don’t we just fast-forward our evolutionary process here and just say that we’re heading a little closer to human proportions.”
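The framing constraint behind that choice reduces to simple arithmetic. A back-of-envelope sketch with purely illustrative numbers: if the ape’s eyes must sit at the actor’s eye height, and the ape’s hip-to-eye distance is fixed by the design, the legs have to absorb the difference.

```python
actor_eye_height = 1.73   # m, eye line of a roughly six-foot performer (illustrative)
ape_hip_to_eye = 0.95     # m, torso-plus-head stack of the ape design (illustrative)
chimp_leg_length = 0.55   # m, hip height at real-chimp proportions (illustrative)

required_legs = actor_eye_height - ape_hip_to_eye   # 0.78 m
leg_scale = required_legs / chimp_leg_length        # ~1.42x longer legs
print(f"legs scaled by {leg_scale:.2f}x to hold the shot composition")
```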

It turns out the decision to feature longer-legged apes would also help in ‘selling’ a significant number of horse riding shots. “It’s always been a challenge, every time we’ve ever put a gorilla on the back of a horse, it always looks ridiculous,” comments Winquist. “You get a lot of mass up top and these dinky little legs that have to have these hiked-up stirrups. So we made our gorillas and our chimps have longer legs.”

From motion editing, a blocking pass was then sent to editorial. “That will just be the body motion, making sure head angles are roughly in the right position and compositions are as they were shot,” notes Story. “If it’s something where we’re putting cameras on, we also explore different camera options that would work in a shot.”

“We would then send a WIP pass,” continues Story, “which is about getting our facial together with the body and presenting that, making sure that the facial and everything’s tying together, and then it’s a detailed pass on both for final.”

Winquist adds that developments in Wētā FX’s Deep Learning Facial Solver impressed him to the point that the initial blocking pass effectively became a combined blocking and WIP facial pass. “Instead of the characters just staring into space, there was actually a performance there that you could start analyzing. We would start getting facial animation notes on a blocking pass, which is pretty great to jumpstart the whole process.”

While the deep learning solve proved extremely useful, there is (and always has been) a significant keyframe animation side to Wētā FX’s creature work. “A lot of the characters are full keyframe facial animated,” states Story. “There’s been a lot of animation time put into some of the less hero characters and even the hero characters, where sometimes the facial rig wouldn’t do what it was supposed to, or the characters are too close together, or we might have had to take one of the cameras off the actors. The rigs for our apes were great in that we could do both Deep Learning Facial Solver solving and full keyframe motion to them as well.”

As far as particular animation challenges were concerned, Story identifies the much larger amount of dialogue required this time around for the ape characters. “We had to figure out how we could articulate that in a way that just felt more natural. Obviously, a lot of that comes from the performances that we had from our actors. But lip sync is always tricky, whether it’s human or otherwise. For the apes, we had to dial in those ranges that you have from the human to the ape performer and make it feel naturalistic, while keeping an animal-esque feel to them as well.”

On-set capture of everything else, VFX-wise

The performance capture of the actors is, clearly, central to the creature work in the film. But just like any live-action visual effects project, a whole range of additional surveying, data capture, HDRIs and other on-set work was crucial to building up the data artists need to properly integrate the CG apes into plates. “Stuffies” of Noa and Raka, made by Wētā Workshop, were also used for on-set lighting reference.


Wētā FX follows a bespoke process with a set of tools collectively called PhysLight for going from on set capture to final render in its proprietary Manuka renderer (an interesting stat from the VFX studio is that Kingdom of the Planet of the Apes required 946,000,000 thread hours to render between the on-site render wall and cloud rendering). The idea with PhysLight is to capture and measure the full dynamic range of the sun and lighting setups in each scene, which then feeds into a suite of lighting tools that work with real-world units and terminology. The one addition to that for Kingdom, says Winquist, was lenses. “We’ve never previously measured what the lenses are doing to the light. We’ve been replicating the spectral emission from our light sources. We’ve been replicating the spectral sensitivities of the virtual cameras in our renders with Manuka and all of our textures are getting upsampled into full spectral for light transport, but we’ve never been accounting for what’s happening in between the sensor and the subject, and that’s the lens.”

“I built some hardware and took it with me to Sydney and went to Panavision and spent the afternoon there gloriously measuring these amazing lenses to determine things like, ‘the C-series 35 is actually slightly yellowy green’ and ‘the V-series is a little bit more neutral.’ We wrapped that all up into a package that, when we validated the HDRIs this time, it was essentially taking into account that the lens was slightly yellow green or slightly warm or slightly cool, and just getting us that much closer to a match to what was there on the day.”
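A minimal sketch of the idea, with entirely synthetic spectra (Wētā FX’s PhysLight internals are proprietary): fold the measured lens transmission into the spectral chain, so a slightly yellow-green lens tints the CG the same way it tinted the plate.

```python
import numpy as np

wavelengths = np.arange(380, 731, 10)                 # nm samples across the visible range
light_spd = np.ones_like(wavelengths, dtype=float)    # stand-in illuminant spectrum
lens_T = 0.9 - 0.0002 * np.abs(wavelengths - 550)     # toy transmission, strongest in green
cam_sens = np.stack([                                 # toy RGB spectral sensitivities
    np.exp(-((wavelengths - 600) / 40.0) ** 2),       # R
    np.exp(-((wavelengths - 540) / 40.0) ** 2),       # G
    np.exp(-((wavelengths - 460) / 40.0) ** 2),       # B
])

filtered = light_spd * lens_T   # the spectrum that actually reaches the sensor
rgb = cam_sens @ filtered       # integrate to the virtual camera's RGB response
```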

Another consideration on set was how to capture clean plates. “At the beginning of prep,” recounts Winquist, “I had put together an hour-long presentation going through shots from the previous three films, saying, ‘Let’s talk about how we captured these and why we shot what we shot and why we did it this way just to really quickly get everybody’s heads around what the process is,’ which was really helpful. But the main thing that came out of that was, the clean plate is the plate. We need to shoot the actors doing their thing, yes, but then they need to get out of there and then we need to shoot a clean plate because that’s what we want to put in the movie.”

“The thing is, Wes and his DP Gyula Pados, they’ve got this great shorthand. They’re all about a very run-and-gun style of shooting. I just needed to make sure we had all this other stuff, and the clean plate was what we wanted to use in the movie. To his credit, Wes has been phenomenal to work with on this movie. He understood the point of it, and we used that clean plate approach as much as we could in the film. There were cases where the performance plate was 200% better than the clean plate. Wes is so attuned to camera, he’s got an amazing eye for camera movement and just the subtlest little nuance of the way that it moved this way instead of that way in the clean take, he’d see that, and we’d make it work.”

Some clean plates had the benefit of a simul-cam set-up and depth data to help compose scenes. These related mostly to interior stage builds, such as the eagle nesting area and Proximus’ chambers for the dinner party conversation. “Having a simul-cam allowed Ryan Weisen, the A camera operator, to shoot a good clean plate,” says Winquist. “That clean plate got fed into Unreal for a live comp with a matte, mapping the preferred take that Wes had from, say, take five, of the performers. We could essentially map that onto one of the actor’s puppets in real time, comp that over the live footage from the camera and feed it back via QTake to an iPhone that Ryan had on his Alexa Mini LF camera, so he could see both what he was shooting in his main viewfinder but then also see a preview of how that was going to actually frame the actor, and then make his framing choices appropriately for the ghost of the character that was no longer standing in front of him.”
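Strip away the real-time machinery and the preview the operator saw reduces to a standard ‘over’ composite: the matted performer from the preferred take laid over the live clean-plate feed. A sketch of that core operation only; per the text, the actual setup ran through Unreal and QTake.

```python
import numpy as np

def over(fg_rgb, fg_alpha, bg_rgb):
    """Premultiplied-alpha over: result = fg + (1 - alpha) * bg."""
    return fg_rgb + (1.0 - fg_alpha[..., None]) * bg_rgb

# Toy frame: an empty matte over the clean plate returns the clean plate unchanged.
h, w = 1080, 1920
frame = over(np.zeros((h, w, 3)), np.zeros((h, w)), np.ones((h, w, 3)))
```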

Raka started with a single photograph

The orangutan Raka (Peter Macon) is a stand-out character in the film. He not only delivers scores of wise lines of dialogue, but also features in one of Kingdom’s high-octane scenes that required meticulous water simulations to interact with Raka’s fur. For Wētā FX, Raka presented an immediate challenge in terms of design.

“We spent a long time in the design process, partly because the only thing we had to go on for Raka in any kind of meaningful context was a photograph of a face that Wes absolutely loved,” recalls Winquist. “Through the process of a reverse image search, I actually discovered who that orangutan is, and it turned out it is in a zoo in Wuppertal, Germany, where the sister of one of our VFX supervisors, Phillip Leonhardt, lives. So Phil’s brother-in-law went to the zoo and got us all these photographs of that particular animal from off-axis so we could actually see its shape. Orangutans have got such a weird-shaped head, and all the scraggly hair that you see hanging over his forehead is like a big comb-over. It was getting that reference that really unlocked Raka for us. It was awesome.”

Raka delivers many funny and also thoughtful and meaningful lines, with Story noting that Wes looked to Wētā FX to amplify certain actions. “His muzzle is a lot bigger than the other characters. We had to find that balance, that key range for his dialogue, especially, for us to convey his motion, making sure that we are matching what the actor was doing, and making sure his eye shapes are closer to what Peter’s were. One thing was, Peter performed with his head quite upright and we constantly found that instead we had to hunch Raka over a little bit so that his shoulders felt a little bit higher, to amplify him.”

“There was that moment just before they spot Mae (Freya Allan) taking the blanket off the horse and they chase after each other and he just has that little exchange with Noa, saying ‘I can teach you.’ Ultimately it was Peter’s performance and we were really just trying to match and keep it within character of the actual design.”

For Raka’s fur–which is the longest of all the creatures in the film–Wētā FX relied on its Loki framework for simulating hair and coupling its movement to other elements such as wind or water. “So many of the plates that we had from wherever we were around Australia, we’d get a wind kicking up and you just get all that nice fluttering in the leaves and you need to then have some evidence of that in the fur,” outlines Winquist. “Our Loki solver for all the fur dynamics on this was feeding through into everything.”

“There was also the big breakthrough on War for the Planet of the Apes,” continues Winquist, “with the shading model for the hair that started to simulate the actual components of what’s in animal hair that doesn’t exist in human hair–the medulla, the cortex, for example, plus how that hair behaves in sunlight when it gets backlit. We were able to capitalize on all of that work and roll that into his look. What was great here, too, was that we have such a range of lighting environments to work with. So much of the previous trilogy was gloomy or dark and overcast, with really soft flat lighting, whereas in this film we’ve got the blazing Aussie sunshine to contend with. It threw down lots of interesting challenges, and the lighting team just really came through.”

The river and 1.2 petabytes of disk space

Raka, Noa and Mae are confronted by Proximus’ muscle at a river crossing. The FX simulations in that sequence would require 1.2 petabytes (1.2 billion MB) of disk space. To build up that scene, animation started with the original performance capture, some of which was carried out in partial sets with flowing water.

“Our work was a lot of back and forth over which shots retained plate elements for the water and which would ultimately be CG,” relates Story. “We would be trying to keep as much of the plate performance in there as we could.”

“A great example of that is a close-up shot where Raka pushes Mae up out of the water,” advises Winquist. “That’s the performance take. Special effects provided us with a small river tank that had this current flowing that they could control. Freya and Peter are there in that river flow. We were able to use Freya, paint out Peter, and replace him with CG, of course, but the water that was actually pushing up against Peter’s chest in his wetsuit that he was wearing is in the movie. It was great to actually take advantage of the plate water.”

Although Wētā FX had gone through a major R&D phase for The Way of Water to develop its Loki state machine for coupled fluid simulations, the river sequence in Kingdom presented some different challenges, in particular that the water was nasty, sediment-filled stuff, full of dirt and debris, with a lot of surface foam. “For the purposes of efficiency and flexibility, we were leaning less on the state machine approach and bringing back in some of the older ways of working on water sims,” notes Winquist. “One thing we’d do is run a primary sim at low-res first that would give Paul and his team a really low-res mesh that they could at least animate to. So for example, they’d translate Peter as Raka, who was sitting in an office chair getting pulled around on a mocap stage, and work that into a very low-res sim. In the meantime, the FX team would direct the flow and get a rough version of that in front of Wes.”

This essentially involved ‘art directing’ the current and camera, specifies Winquist. “Does the camera dip under for a moment and come back over? What does that mean for having to play water sheeting down the lens? Once we had that art directed river in low-res, animation could go off and start animating apes against that current. Then also our FX team could go in and start looking at up-resing that into a much higher resolution sim.”

The next steps were a back and forth, with animation done against the low-res mesh. The benefit is that animation done to the low-res mesh matches well to, and integrates with, the subsequent high-res fluid simulations, although tweaking is always required. Once these steps were complete, the creatures team would take the flow fields of the simulation and use them to affect the hair of the apes.

“For that we’re using Loki for the hair of the creatures and water to all interact,” says Winquist. “Then we take the creature bakes, bring them back into the sim, and then FX has to go in and do a super high-resolution, thin-film simulation against the hairs, because now we need to make sure that we’re taking into account volume preservation of the water. If they jump out of the water, we also need to show that the water is now starting to drain out of their hair.”
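As an illustrative sketch of the flow-field-to-fur handoff (this is not the Loki solver, just the core idea): sample the sim’s velocity field at each guide-hair vertex and drag the groom along with the current, ahead of the thin-film pass described above.

```python
import numpy as np

def advect_guides(guide_pts, velocity_at, dt, drag=0.6):
    """guide_pts:   (N, 3) guide-hair vertices in world space.
    velocity_at: callable mapping (N, 3) positions -> (N, 3) water velocities.
    drag:        coupling strength (0 = hair ignores the water, 1 = fully carried).
    """
    return guide_pts + drag * velocity_at(guide_pts) * dt

# Toy usage: a uniform 2 m/s downstream current along +x.
guides = np.zeros((100, 3))
current = lambda p: np.tile([2.0, 0.0, 0.0], (len(p), 1))
guides = advect_guides(guides, current, dt=1 / 24)
```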

The art of apes on horses

Apes on horses feature heavily in the film. Accomplishing those complex shots was done in several ways. One consideration for Winquist was not having to do digital horses for every shot. “At the same time, however, some of the things that we were needing to do were completely impractical or weren’t going to be possible for a horse team to pull off, at least not without massive white-taped wire guardrails all over the set that we’d have to paint out. So it’s a mix of shots where the horses are real and we needed to match move for the saddle position, and others where they’re digital.”

One example of a complex horse shot was the three teenagers, Noa, Soona and Anaya, riding into the village after they have finished their climb sequence in search of eagle eggs. Explains Winquist: “That’s just three of our horse riders, not our actors, riding horses into the village set. We match moved the horses for where the back was for a solid lock on the saddle, so that animation knew where to put their CG apes. Then we take the Noa, Soona and Anaya actors and place them on the back of a half round or a barrel being pushed through space by some grips. Paul would ask the actors in those pickup sessions to not try and mimic riding a horse but to just act the scene out and we’ll handle the physics of all that stuff.”

The film’s horse master was Graham Ware Jr., who happened to have been involved in the horse riding on the Lord of the Rings films. “I told Graham that we actually still used horse mocap from the Rings films that he had helped us with!” exclaims Winquist. “We’re still dipping into library motion for horses from way back when, although there was also a big R&D push way back on Abraham Lincoln: Vampire Hunter to study horse musculature and anatomy inside and out that we leap off from.”

Winquist also praises the paint and roto teams for work done on plate preparation for horse riding (such as painting out the riders and performance capture suits). “Those departments have been forever the unsung heroes of this whole thing. If we can’t paint out the rigs or anything else, the whole facade crumbles. We certainly kept them busy on this. In fact, paint and roto were so critical for us on this show because we probably used a higher percentage of performance plates on this movie than we did on any of the others.”

Eagles, and Wētā FX’s new Apteryx feather tool

Alongside the CG apes, digital eagles are also stars of the film, in particular golden eagles. Golden eagles are not found in Australia, where filming took place, so Wētā FX looked to other suitable reference, as Winquist details. “I was looking into it at the beginning to see if I could find some taxidermy on eBay of a golden eagle and could the production just buy it and ship it down to Australia. But golden eagles are a protected species and it gets into this whole pile of red tape around the importation of animal remains. Then I wondered if we should have Wētā Workshop make us a feathered lighting reference, but that came down to the cost and the fact it might have been trashed within a week since it would be quite delicate. So we had them give us a very basic 3D-printed, plastic-painted stuffy, not for feather lighting reference, but for basic framing and lighting direction.”

What also helped was a bird handler in Australia who had a wedge-tailed eagle and could demonstrate falconry with the bird for the production. “The actors could all put the gauntlet on and actually feel what it was like to have a four kilogram eagle on your arm and actually have it take off and have it land,” says Winquist. “It was fantastic reference for us. I was out there with the camera shooting 120 frames a second, just getting this footage that we could pore over in post, which really helped inform, say, the way an eagle’s wings flare coming into land on somebody’s arm, the way that the feathers ruffle from the air turbulence over the leading edge of the wings and all of that kind of stuff.”

Story mentions that a previous Wētā FX project, Peacemaker, which featured a bald eagle called Eagly, also helped inform the bird work on Kingdom. “We had some good base motions there. We just kept adding and transferring that to the rig that we had so that all proportions were working for this film. The animator would go through and pick key component timings and actions that would work for that shot that we could then blend to and from with our library motion. It had to be based on timing and action and little head turns and things like that to keep it as real as we can.”
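A hedged sketch of that blend-to-and-from-library idea (pose representation and window length assumed): cross-fade from a library wingbeat pose stream into a hand-keyed action over a short window. A production rig would slerp joint quaternions; a linear fade keeps the idea visible.

```python
import numpy as np

def blend_to_keyed(lib_pose, keyed_pose, frame, window=12):
    """Cross-fade weight ramps 0 -> 1 over `window` frames (0 = library, 1 = keyed)."""
    w = min(max(frame / window, 0.0), 1.0)
    return (1.0 - w) * lib_pose + w * keyed_pose

lib, keyed = np.zeros(60), np.ones(60)  # stand-in 60-channel pose vectors
poses = [blend_to_keyed(lib, keyed, f) for f in range(24)]
```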

In terms of golden eagle feathers, Wētā FX had begun the project with its existing plumage tools for grooming, but by the end of Kingdom had developed a new feather tool called Apteryx. “The feature set had become rich enough that the models team actually went back and reworked the groom for some parts of the eagle done with the previous tool,” describes Winquist. “Some of the close-ups that you see had the benefit of the new grooming toolset that just gave them more control.”

So many VFX challenges

While CG apes were the primary challenge on the film for Wētā FX, there were countless other VFX challenges the studio had to contend with. One was the vast jungle and overgrown city environments; another, Winquist points out, was fire, including for the village attack.

“One thing I’m hugely proud of in the movie was the fire work in the village attack. I thought our FX team (effects supervisor Claude Schitter) just smashed it. There’s a 90-second oner in that scene of continuous animation and entire-village chaos, which is amazing. Between that and the torches in the film–there’s so much torch work that is just so hero and close-up–I’m just really proud of where they got to.”

For Story, too, some of the subtle animation of the apes is among the work he is most fond of, such as close-ups on the hands of the creatures as they flick through books in a couple of different scenes.

“With those detail shots, obviously capture can only go so far. You get a base on performance, but when you get detailed things like that, you have to match move pages, you have to match move fingers. It’s a matter of aligning those specific joints to the match move and just finessing,” says Story.

“Just so much finessing,” marvels Winquist.

 
