MPC VFX supe Nick Davis breaks down the techniques.
For Thea Sharrock’s The One and Only Ivan, the filmmakers needed to tell the story of a group of animals—a gorilla, elephants, dogs, a sea lion, a chicken, a rabbit and a macaw—living in captivity in a shopping mall, and also show them in fully CG jungle environments for several sequences.
Scenes with live-action actors and sets followed a more traditional VFX workflow, while scenes where the animals and their environments would be essentially fully synthetic relied on virtual production techniques spearheaded by MPC’s Genesis Virtual Production team. Interestingly, this included fully CG shots of the primary ‘Backstage’ live-action set, which was re-created as a digital model, enabling filmed footage and full CG shots to be cut back-to-back.
In this befores & afters breakdown, production and MPC visual effects supervisor Nick Davis elaborates, step by step, on how virtual production came into play, from initial mocap to game engine rendering and virtual cinematography. He also discusses the on-set VFX techniques.
1. The rationale for using virtual production techniques in the first place
Nick Davis: The movie split itself quite nicely 50-50 between virtual and live action when we broke the script down. Right from the get-go, we thought to ourselves, ‘Look, what we don’t want to do is build all these sets for the live-action scenes and then go in there and just shoot empty plates for the virtual scenes,’ because we had pages and pages where it was the animals talking to each other at night in their own cages.
We knew we had to build these same sets for the practical sequences when we had actors. But it just seemed like a very old-fashioned and limiting experience for the director and the filmmakers to go in there and follow some sort of previs and say, ‘Okay, well, we’ll shoot a wide. And then we’ll say that the dog’s over here. And it’ll move over there.’
Having done a lot of work like that in the past, you run very quickly into the problems of, well, what are we following? How do we know the real path that the animal’s going to take? Where are we going to put the focus? It just becomes a mind-numbing experience for everybody to shoot empty plates when you don’t really know what’s going to go on in them. So we very quickly came to the conclusion that we wanted to do it as a virtual camera experience. We felt this would put control of the virtual sequences much more back into the hands of the director and the DP and the camera crew.
2. Planning a virtual production shoot: the first stages
We worked with the production designer, Molly Hughes, to mark out the sets on a rehearsal stage. We had a gorilla actor who was a motion capture performance actor to play Ivan. And we got some puppeteers to play Bob the dog, and Ruby and Stella, the elephants. Here, Thea started blocking out their movements.
Once Thea had figured out those sequences in her head and blocked them out and rehearsed them, we then transposed that into a motion capture environment. It was here that we blocked out the sets very accurately. We needed to know where the tree was, where the water holes were, where the rocks were. We built them as motion capture sets with all the relevant changes in heights and props and pieces.
Just prior to that we also did voice recordings with the voice artists, Sam Rockwell, Angelina Jolie, Danny DeVito and everyone else. That gave us a good edit of the voice recordings. And then our performance actors had learned the lines and were able to perform the lines while listening to the performance of the voice actors.
3. The mocap shoot
We spent two weeks on motion capture stages capturing Ivan’s performance, with the puppeteers giving him the eyelines and the correct positions. At the end of it we then had what we called ‘master clips’ of each of the little sequences.
Once we’d recorded the master clips, we then passed those onto MPC, for their animation team. They then transposed and turned the puppeteers into the animated characters. And they took Ivan’s motion capture performance and cleaned it up and made it work as a character. We came back to Thea and started to show her the master clips with all the characters now animated, also within a virtual environment, which we’d put into Unity.
We popped a few cameras on it so Thea could see it from some different angles. And then as she finaled the master clips, the basic animation, we would start stockpiling them.
All of this was a concurrent process while we were actually shooting the live-action portion of the film (see below for more on this), which meant that during live-action shooting, every single lunch break or every evening, we would show her these animated master clips until we had built up a huge library.
4. The virtual camera shoot
At Pinewood, we went onto a virtual camera set, where we had the DP, Florian Ballhaus, the dolly grip, crane grip, focus pullers etc. We had all of the same crew from the live-action shoot come onto the virtual stage. We had dollies, we had cranes, we had geared heads, we had fluid heads. We had handheld cameras, we had Steadicam. We had all the same tools that we’d used on the live-action. Except now, rather than pointing them at real sets and real actors, the tools were driving virtual cameras that were filming the virtual master scene clips we had.
We could put the DP and the director into VR headsets and they could then walk around the set, they could see the characters performing from the master clips in these virtual environments. And then they could block it out and they could go, ‘Okay well, we’ll do a master shot from up here. We’ll do an over from here. We’ll do a two shot. We’ll do a single.’ It was all the same coverage that they would have done had it been on a real set.
We were also able to light the scenes in Unity. Florian could say, ‘Well, we know the key light’s here. We know we’re going to have this amount of fill. We’ll put a few little rim lights replicating lights coming in from X, Y, and Z.’ It became quite an intuitive process whereby everybody was sitting at monitors doing what they would have done on a normal film set, except there’s just nothing there. So, it does become a slightly surreal experience.
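As a rough illustration of what blocking out a key, fill and rim in-engine amounts to in data terms, here is a minimal Python sketch of a three-point rig description. The class, light directions and intensity values are purely illustrative assumptions, not Unity’s lighting API or MPC’s actual tooling.

```python
# Illustrative three-point rig description; values are assumptions,
# not taken from the production.

from dataclasses import dataclass

@dataclass
class RigLight:
    name: str
    direction: tuple[float, float, float]  # unit vector toward the subject
    intensity: float                       # relative exposure, key = 1.0
    color_temp_k: int                      # colour temperature in Kelvin

three_point = [
    RigLight("key",  (-0.5, -0.7,  0.5), 1.00, 5600),
    RigLight("fill", ( 0.6, -0.3,  0.7), 0.35, 5600),  # softer, lifts shadows
    RigLight("rim",  ( 0.2,  0.4, -0.9), 0.60, 6500),  # edge light from behind
]

def key_to_fill_ratio(lights: list[RigLight]) -> float:
    """Compute the key:fill contrast ratio a DP would reason about on set."""
    by_name = {light.name: light.intensity for light in lights}
    return by_name["key"] / by_name["fill"]

print(f"key:fill ratio = {key_to_fill_ratio(three_point):.1f}:1")
```

Part of why this process can feel intuitive to a film crew is that the numbers map directly onto the ratios and colour temperatures a DP already thinks in.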
On this virtual camera set, we spent eight weeks filming all of the master clips that we had assembled. At the end of the day we had a whole series of dailies and they would go off to editorial. We’d also send them into a post-render process where we would improve their quality. We’d add volumetrics and additional lighting and make them look a little bit more filmic. Then editorial started cutting the sequences to check that we’d got enough coverage and that the sequence was there. Eventually this was turned over to MPC to take to the final stage.
We used the Genesis tools that MPC and Technicolor have been building, which came out of their experience on The Lion King. For Ivan, I wanted to keep the simplicity of real-world tools. I just wanted to be able to say to the guys, ‘There’s a crane, there’s a dolly, there’s a Steadicam. You tell us where you want to be, and we’ll put you there in the world. Of course we can change the parameters, and we can give you a track that moves the virtual camera at 10 times the speed you push it, if that’s what you want. But why don’t we just keep them as real-world one-to-ones? Let’s keep it really simple and utilitarian.’ And actually that’s what the crew wanted, too.
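To make the ‘one-to-one versus scaled track’ idea concrete, here is a minimal Python sketch of how a physical dolly move might be mapped onto a virtual camera. The encoder input and all names are hypothetical; the Genesis tools themselves are proprietary, and this only shows the general shape of the mapping.

```python
# Minimal sketch of mapping a physical dolly's encoder movement onto a
# virtual camera with a configurable track scale. All names and the
# encoder interface are illustrative assumptions, not MPC's Genesis API.

from dataclasses import dataclass

@dataclass
class VirtualCamera:
    x: float = 0.0  # position along the virtual track, in metres

def update_camera(camera: VirtualCamera, encoder_delta_m: float,
                  track_scale: float = 1.0) -> None:
    """Advance the virtual camera by the physical dolly movement.

    track_scale = 1.0 keeps the real-world one-to-one feel described above;
    track_scale = 10.0 moves the virtual camera ten times as far as the
    physical push, as in the example Davis gives.
    """
    camera.x += encoder_delta_m * track_scale

cam = VirtualCamera()
update_camera(cam, 0.5)                    # one-to-one: camera moves 0.5 m
update_camera(cam, 0.5, track_scale=10.0)  # scaled: camera moves 5 m
print(cam.x)  # 5.5
```

Keeping the default scale at 1.0 is exactly the ‘simple and utilitarian’ choice described above: the crew’s muscle memory with dollies and cranes carries over unchanged.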
I mean, we could have done the whole thing with just previs and postvis, and you could have sat there with animators and artists working on their computers. But I think what this approach did was bring the same shooting style and the same organic feeling that we had in the practical photography into the virtual photography. And therefore I hope that the viewing experience is not two different movies, but one seamless movie.
Still, we absolutely previs’d large chunks of the movie with external companies. All of the flashback sequences we previs’d, and the more action-oriented escape sequences, too.
5. How the live-action portion of the shoot worked
For the mall and the whole behind-the-scenes set area where the animals live and spend most of their time, that was filmed on a giant set on the 007 Stage at Pinewood Film Studios in London. We shot there for about six or seven weeks with the actors. Here, Ivan was played by the same performance capture artist who played him in virtual production and in the rehearsals. He now wore an on-set motion capture costume that gave us real-time motion capture data, which was great. We used Xsens suits. That gave the animators somewhere to start from. But of course it also gave our actors eyelines and something to perform to.
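For a sense of what consuming real-time suit data looks like in principle, here is a hedged Python sketch of a listener loop. The UDP packet layout, joint list and port number are assumptions made for illustration; this is not the actual Xsens streaming protocol or MPC’s on-set pipeline.

```python
# Hedged sketch of a real-time mocap consumer loop. The packet layout,
# joint list and port are illustrative assumptions, not the Xsens protocol.

import socket
import struct

JOINTS = ["hips", "spine", "head", "l_arm", "r_arm"]  # illustrative subset

def read_pose(packet: bytes) -> dict:
    """Unpack one quaternion (w, x, y, z) per joint from a fixed-layout packet."""
    pose = {}
    for i, joint in enumerate(JOINTS):
        w, x, y, z = struct.unpack_from("<4f", packet, i * 16)
        pose[joint] = (w, x, y, z)
    return pose

def run(port: int = 9763) -> None:
    """Listen for streamed suit data and hand each pose to the renderer."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        packet, _ = sock.recvfrom(4096)
        pose = read_pose(packet)
        # In production, each pose would be retargeted onto the gorilla rig
        # and rendered live for on-set eyelines; here we just print the head.
        print(pose["head"])

if __name__ == "__main__":
    run()
```

The point of the low-latency loop is exactly what Davis describes: the streamed performance is useful both as a starting point for the animators and, rendered live, as something for the actors on set to play against.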
Then for all the other characters we had puppeteers on the stage. We had giant puppets for Stella. We had little handheld puppets for Bob and little rod rigs to move Bob around. And we had a little Ruby that we could just wheel around and be in the right position to be interacted with. We also had green versions of them and more photorealistic versions so that we could see how they behaved in the set lighting.
We scouted Florida to find the exterior of the mall. And then at the same time, luckily enough, we found this place about 50 miles from the mall that we used for where Ivan ends up going at the end of the movie. It was such a good jungle likeness that I also used it as the base canvas for the flashback sequences. We obviously augmented it and added lots and lots more of a jungly feeling that didn’t exist there, but it gave us a fantastic starting point, and meant that we didn’t have to build entire virtual forests, although we built an awful lot. We took our mocap performance artist for Ivan as well, so that he could give us the whole sequence when he’s first released and walks out onto the grass and takes the flower.