‘It was literally: shoot, here’s the plate, make it work, go’

January 12, 2024
Ken Watanabe as Harun in 20th Century Studios' THE CREATOR. Photo courtesy of 20th Century Studios. © 2023 20th Century Studios. All Rights Reserved.

Behind the slightly old-school VFX approach on ‘The Creator’.

When Gareth Edwards’ The Creator was released this past September, it was accompanied by commentary from the filmmakers about a unique approach to the visual effects.

That approach was to shoot the film first, and only later work on world and character designs and visual effects. A particular example of this saw Edwards shoot certain scenes of the centerpiece floating village attack with live-action actors, and then only in the edit decide which of those actors might be turned into robots by the visual effects team, led by Industrial Light & Magic.

If you think this sounds a little like the way VFX used to be achieved–before the now common heavy reliance on previs, on-set surveying and measurement–then you’re right. However, it was a deliberate approach from Edwards and his team, designed to bring a very naturalistic feel to the final film.

Ultimately, it fell to Industrial Light & Magic and its partner VFX vendors to bring the world and characters of The Creator to life with a much more ‘traditional’ brute-force approach to matchmoving, paint and rotoscoping, particularly for the film’s robots and simulants.

That’s something befores & afters talked to ILM visual effects supervisors Charmaine Chan and Ian Comley about at the recent VIEW Conference in Turin, where they showed some stunning breakdowns of the work (on the film, Julian Levi was visual effects producer, Jay Cooper was ILM visual effects supervisor, and Andrew Roberts (ILM) was on-set VFX supervisor).

Here’s our conversation about this unique approach to filmmaking, which also discusses early tests ILM carried out, just what went into the roto/paint/matchmove/matchanim and comp for robots and simulants, a special robot kit made up in Nuke, crafting the environments, and the use of ILM StageCraft in the final act.

b&a: In your presentation, you showed some early tests that were done for the film. Were you involved with those yourselves?

Ian Comley: No, those were overseen by my colleague, John Knoll.

b&a: Of course. I really liked what you showed, which I think included a rice field and farmers out there, augmented with futuristic looking buildings. It was amazing how much that augmentation ‘changed’ the look and feel of the original plate, but also you could still ‘feel’ the original.

Charmaine Chan: Yes, it was so great. Just from those structures it was like, ‘Yep, I’m in a completely different time period.’ It just took you away completely. Even without it, it’s already a beautiful shot, and so I think it was really smart for Gareth to just go and shoot that footage.

b&a: You also showed a test of a man on a motorbike transformed into a robot, and a man smoking on a boat being transformed into a simulant.

Charmaine Chan: Yes, I believe that man on the boat just happened to be there. The camera rolled, the guy looked over and wondered, ‘Who are these people filming me?’ It was so natural.

Ian Comley: You really get the reveal with that test because you get that moment as the guy turns away and you see the profile and all the negative space inside his head. It’s beautiful.

Charmaine Chan: It was also about not over-directing people, just letting them naturally do what they would do.

b&a: Yes, in terms of that naturalistic side of things, this film seems like more of an ‘old-school’ visual effects project, where you had to do a lot of traditional match-moving, paint and roto, without the use of a lot of greenscreen, set scanning and surveying. The results are amazing, but what was it like working on something that had that approach?

Charmaine Chan: Yes, well, approaching this project, we knew that the scope and scale was huge, but what we didn’t realize was that Gareth was adamant about doing this as simply and quickly as possible. What that meant was that we weren’t looking at doing our usual complex setups of getting HDRIs or getting all the LiDAR data. It was literally: ‘shoot, here’s the plate, make it work. Go.’

And I think we all responded to that very well, because we haven’t done that to this degree in a while. It was about going back to our roots of visual effects, where we’re augmenting things that already exist. Even though what we were creating was not something that exists in the real world, it’s still grounded in a reality because of that.

b&a: You felt the same way, Ian?

Ian Comley: Oh yes, one hundred percent. Before that, I was working on ABBA Voyage–which I loved–but then coming on The Creator, we got plates, beautiful plates! And at 24fps. I think for so many artists who came on the project–who had had their fair share of other experiences, like lots of greenscreens or lots of decisions made very late in the day–coming to this, where the intent and the image are just plain as day, was a real revelation. You’ve got the starting point, you’ve got all the things you need, even without the extra sources of data. The crew genuinely found it magical coming on this.

b&a: A classic example of this different type of workflow seems to be for a typical simulant shot where you needed to show the character with a hole in the head and the robotic form around the head. Is it really true that, generally, it was filmed kind of ‘run and gun’ without the benefit of clean plates?

Charmaine Chan: Yes, it is true. Gareth just went and shot all the footage. Fortunately, we had Andrew Roberts supervising on-set, making sure we’d be able to work with what we would be receiving. It came back to us as an edited cut of the movie and we dug into it. Just to note, ILM is very good at this. But it was all about, ‘Alright, let’s look at the plates,’ since that’s what we had to work with. We knew it would be a lot of paint to remove the back of the head, and a lot of roto to make sure we retained as much of the actor’s head and face shape as possible, but then also a significant manual matchmove to get our CG to track with the actor seamlessly.

There were occasionally small tracking markers drawn on their faces, which gave us a basis for the tracking. But overall it was about keeping it as simple as possible. And, remember, this is what we all had to do a long time ago, before we ever used to capture so much data.

b&a: The effect of it though, which is clearly what Gareth was going for, is a naturalistic feel, not just from camera movement, but just plugging into whatever the lighting is. I guess in the end the plates are heavily augmented, but it doesn’t feel like they are.

Ian Comley: Yes, and his approach to filming several scenes was also to keep the camera rolling. The DOPs, Greig Fraser and Oren Soffer, would light these beautiful scenes–well, it’s quite often natural light, but they would use bounce cards and such to make the space beautiful–and then Gareth would let the camera roll and roll, not call cut, not interrupt. He’d say, ‘Alright, I’m going to try it over here. Just keep going, everyone. I’m going to try it over here. Keep going.’ So, because the whole zone, the whole scene, was ready and available with wonderful naturalistic light, he could do that. He had the flexibility to do that.

A scene still from 20th Century Studios’ THE CREATOR. Photo courtesy of 20th Century Studios. © 2023 20th Century Studios. All Rights Reserved.

b&a: The same kind of ‘loose’ approach obviously happened with the robots, too, where they were, I think, extras or actors just performing the scene, and then Gareth would decide who might end up being a robot. This is still kind of amazing to me. But, again, I just want to make sure I highlight the paint and roto team here and all the artists who had to make these shots work.

Charmaine Chan: Yes, we had an amazing global paint and roto team led by Peter Welton. It was a painstaking manual task, but it had to be done, and it was done by some of our best painting artists in the business. Even that shot of the police robots in the grass field, I think that was one artist working on it for almost three or four months.

Paint and roto rarely get the recognition they deserve, but I think they are the complete unsung heroes of this project. We wouldn’t have been able to have such a successful experience, whether it be the robot heads which needed a full paint out, or our simulants with their holes–just seeing a little bit of the background poking through made all the difference.

Ian Comley: We should also mention Abbie Kennedy who was heading up our matchmove and layout team. She and her team had to deal with all those very freeform cameras and do all that match-anim work. There were a lot of evolving choices in the plates, so Abbie and Peter’s teams should get all the kudos for making it work so well. We should also note compositing supervisors Wesley Roberts and Juan Antonio Espigares Enríquez. They’re comp freaks!

b&a: In terms of those robots–which were based on the actors’ in-plate performances–did you still need to do any kind of performance augmentation on them?

Ian Comley: No, not really. When it came to head performance, we had a basis for the head for some of the characters. So I guess we were robotizing the performance in the spirit of our pivot points and pistons and things, within certain constraints. But, no, it really was kept pretty faithful to what we were seeing in the plate, unless there was a story point that perhaps hadn’t been conceived at the time, that we needed to come back to.

Charmaine Chan: With our full robot heads, it was about the little details of things like, for that little lens eye, does it do a little zoom or does the primantis part of the mouth do a little bit of a quiver? It is that attention to detail that our animation team was able to achieve that I think helped push it and give it that extra feeling.

Madeline Voyles as Alphie in 20th Century Studios’ THE CREATOR. Photo courtesy of 20th Century Studios. © 2023 20th Century Studios. All Rights Reserved.

b&a: When it came time to integrate CG work, either for the robots or the simulants, what were you finding worked to integrate that into the plates?

Charmaine Chan: Surprisingly, when we did get HDRIs–because we did actually get them sometimes–they didn’t quite fit the scene. I think this was because they may have been captured for giant location environments, and what we needed was literally just the area around the head. So if your head was a chrome ball, what are the colors that are actually getting very close to you? We found it was getting the reds out of Alphie’s costume or the browns out of the farmer’s outfit. Being able to tie those specific colors in with the headgear is really what sold it.

b&a: When you did have to insert some robots into plates, you mentioned in the talk some pre-rendered in-engine robot kit imagery. Tell me about that.

Ian Comley: Juan Enríquez, one of our comp supes, put that robot kit together. As we were very much in the world of this changing picture of who was going to be a robot on any given day, depending on the shape of the edit, Juan put together this approach in Nuke. We had the robot kit designed and built, full 3D ready, with hero sections of robots, or in fact full-body robots, that we could use wherever we wanted. The kit reproduced shared versions of those, pre-rendered in-engine.

It meant a relatively straightforward match-anim in comp could be used. Now, if any background performer was moving too much, it was clear that we needed our full match-anim process, but if it was a relatively static background character, say, aiming a weapon, or performing a simple action, then we could keyframe that in Nuke, not taking too much of the compositor’s time, and we had that kit available.

b&a: There was something I liked, too, about the simulant faces–like Alphie’s or Harun’s–where the lip of the skin connected with the metallic surface. It really seemed to help sell the gag of a camera coming up to one of them, and then they turn their head to profile and you see what they are.

Charmaine Chan: We talked a lot about how to make a simulant look like a simulant, but also still look human. Whether that be keeping the front of the neck, or adding a little bit of a lip at the edge, because we’re just so used to straight cutoffs. It’s that little attention to detail, to show that they took the time to make simulants. They really wanted it to be as human as possible, but we still needed to signify the technology.

So we had artists who were painting those little edges between skin and metal because we just wanted that slight touch of a highlight. We got renders of it, and sometimes they worked great, but sometimes we just needed that additional touch of really highlighting that lip, to signify how it’s separate but integrated.

A scene still from 20th Century Studios’ THE CREATOR. Photo courtesy of 20th Century Studios. © 2023 20th Century Studios. All Rights Reserved.

b&a: I thought some of the vistas in this film were so great, and is it right that the approach there in terms of VFX was kind of similar, where something was filmed, and then production designer James Clyne would do paint-overs, and then it would come to VFX?

Charmaine Chan: Yes, with Gareth, he had such great ideas of what it was going to be, but he didn’t know exactly what that would be during filming. It was more about creating that world first, by just simple photography, and if he could get that sense from his photography alone, then adding the additional structures or robots wasn’t going to change it that drastically. It’s just something that’s already ready for that world.

Ian Comley: Something else I loved was that Gareth was so great at finding these little day-to-day moments. Things like, in the village, someone’s giving themselves a shower from a bowl of water, children playing, just these little inserts, and they’re given exactly the same screen time as this amazing augmentation that we’ve just spent weeks or months on. But I think that’s wonderful in a way, because it is not throwing anything away, it’s leaving the audience with that increasingly evolving, growing, residual impression that this world has depth, and almost wherever you put a camera, this thing exists. It’s got life outside the particular narrative journey that we are on. Whereas I think many other films would be somewhat gratuitous on that, and you’d have the 15-shot fly-through of the thing, just because we can. I think it’s really powerful the way he does it.

b&a: One of my favorite shots is during the attack on the floating village, and it’s when that tank pushes over the houses on stilts. I honestly thought that may have been a partly miniature shot when I watched that the first time. It feels so real. I am guessing it’s not, that it’s fully CG, but tell me about the evolution of that kind of shot.

Ian Comley: Gareth had this great starting point of people running, and in the background, the wonderful Cambodian stilt houses, and then the word ‘tank’ flashes up–that was the brief. We would start building out the houses, and we could piece them together to create something that matched the spirit of the huts we were seeing in the plate. It was ultimately better for us to just fully remove what was in the plate, to construct our own clean plate for the stilted houses.

Then with all the layers of reeds and grasses and people running, and foreground stilts and things, it was about navigating the maze of paint, roto and CG augmentation. Again, it’s still classic visual effects. It’s detective work and problem solving, and then a fantastic FX team coming along and producing a really nice, intricately layered sim on top.

b&a: By the way, I want the toy, or even the model kit, for those tanks.

Charmaine Chan: Exactly! We all want them, too. But I think this also harkens back to Gareth being a ’70s, ’80s kid. He was influenced by the original Star Wars, and those were all miniatures and models and all the kits that we’re so familiar with. Even though he was trying to make his own thing–it’s not Star Wars–you can still feel a lot of the influence he’s had from watching how Star Wars was made. And of course, he already got to make a Star Wars film.

b&a: Just finally, there’s great use of ILM StageCraft in the final sequences of the film. It seemed like the perfect use of LED wall tech, where you’re seeing out into space. How much pre-work was involved in getting Helios imagery ready for these scenes?

Charmaine Chan: It was a lot of work. It was a huge team. We had Amanda Johnstone-Batt and Roel Coucke as our virtual production/VAD leads. They had a team of folks, including James Mohan, who was also our CG supervisor on the ILM StageCraft side. They worked tirelessly to create these environments in a real-time scene. It was pretty incredible to see what happened. I think this is the most animated environment we had, with the airlock and escape box. All those rotations plus a lot of blinking lights–there were just a lot of moving factors to it, but it made the scene so rich.

Again, I think because Gareth is such a visual person, if we had shot that in front of a greenscreen, we would have spent ages trying to figure out exactly the framing and composition Gareth wanted, but because that scene was there for him, he could literally frame everything up exactly as he wanted.

b&a: I know he’d obviously done Rogue One, but had he shot on a volume before?

Charmaine Chan: No, he hadn’t. That was his first time.

b&a: I’m assuming he was a bit of a natural at it, though…

Charmaine Chan: I think for anyone who shoots on the volume for the first time there’s a learning curve. It’s never what you expect, and you have to wrap your head around the fact that what you see on the screen isn’t exactly what you see in your camera. But I think because Gareth has done V-cam sessions at ILM, and already knows the concepts and ideas behind it, he adapted to it very quickly. It’s no different than any other location. It’s just a digital environment, and our tools are designed specifically for filmmakers and how they like to work.

b&a: I dug the tentacled utility bot that features in that final act. It should have its own movie. But I wondered how that all came about, because it’s a huge sequence just in itself.

Ian Comley: There’s a couple of things to mention about that scene. The first was our digi-spacesuits and digital Joshua [John David Washington]. We had a physical version of the spacesuit built for parts of the shoot where they had plate work and wire work, but we knew it would be a complicated scene, so we did Joshua and the suit in CG. Then, for the utility bot, I think Charmaine should talk about that.

Charmaine Chan: This was where, once again, Gareth pulled from his ’80s collection. He said, ‘Aliens’, and I’m like, ‘Yes, excellent movie.’ We went through quite a few iterations of what the utility bot design actually looked like, pulling from a lot of insect-like designs. It’s a little bit of a spider, a little bit of an octopus, but at the same time, because it is a bot, it needed to be a lot more rigid in its movements.

There’s also actually one scene, one shot in the movie, where you see the bot before this sequence. Right before they hit the panic button on the NOMAD, and everyone is escaping, there’s one scene where all the people in the white biohazard suits go into an escape pod. The bot is off on the side pushing an escape pod into another hold. That’s the first time we see it, and we’re like, ‘Oh, it’s actually a handy bot.’ Because we were wondering, ‘What does this bot do?’ We needed it to do something, and that became the backstory with helping with escape pods.

And then, of course, it went from its helpful state to a, ‘No, now I’m going to attack and keep Joshua away from Alphie’ state. It gave that nice Alien-esque feeling to that whole sequence.
