The on set VFX approach on ‘The Creator’ was different from most

May 21, 2024

The challenges of capturing gray and chrome balls, HDRIs, and reference imagery on the run-and-gun Gareth Edwards film. A new excerpt from befores & afters magazine.

When on set visual effects supervisor Andrew Roberts from Industrial Light & Magic first met with The Creator director Gareth Edwards and producer Kiri Hart to discuss the making of the science-fiction film, there was, he recalls, even at that early stage, a desire for the production to be as slender as possible.
“Gareth said, ‘There’ll be six vans, and we’ll drive around and we’ll pull over off to the side and we’ll capture something, and then we’ll jump back in.’ That was the mindset they had, that it would be very lightweight, very organic,” recalls Roberts.

Indeed, The Creator would eventually be made with what might be considered a very unconventional approach for a visual effects-heavy film. It was crafted in a somewhat ‘run-and-gun’ style, shooting across a number of countries with minimal built sets, and often with a view to developing production design and visual effects after principal photography.

Ultimately, the movie would feature over 1,500 VFX shots, ranging from environments to CG robots, to significant actor augmentation for simulants, and impressive effects simulations. In the process, The Creator received an Academy Award nomination for Best Visual Effects (the nominees were Jay Cooper, Ian Comley, Andrew Roberts, and Neil Corbould). But it was done, as you’ll see in this article, differently from most large VFX productions.

“I remember Gareth saying early on,” shares Roberts, “that ‘I don’t want the tail to wag the dog. All of the mechanics and behind-the-scenes operations that are necessary to make a film, I don’t want those things to dictate my filmmaking or interrupt. If I’m ready to roll and QTAKE isn’t up and running, we’re not going to stop.’ I saw this as it being important to him that all the focus be on that connection between him and the actor. If he’s caught that perfect moment, then we’ll roll and the other things can catch up.”

“He made it clear that he was going to move quickly, that it was going to be light, and that it was going to be ‘performance first’, and that, as the on set VFX supervisor, I might not get everything that I would like,” adds Roberts. “But we were in good hands because with Gareth’s VFX background, he understood all of the things we would normally want to do. But he did say, ‘You won’t necessarily get an HDRI or a scan or photo reference for every take. Let me know when it’s really important.’ He just didn’t want those things to get in the way.”

One reassuring aspect for Roberts related to the first tests carried out for The Creator. These came after a scout shoot Edwards had done in several Asian countries, where he captured candid footage of people and scenes. ILM then took that footage and generated simulant and robot shots (ILM executive director and senior visual effects supervisor John Knoll oversaw that work).

“Given that Gareth went and shot that material, and the team at ILM were able to create those beautiful visuals without any supporting data, it was reassuring that they had completed those shots without any on set reference,” notes Roberts. “I talked to John, too, and he had said to me, ‘If we really have to, we can use old school methods to support what Gareth’s vision is and to give him the visuals he needs.’”

Trial by fire on set

Once production got underway in Thailand in January of 2022, Roberts knew that he had to be ready to jump in to capture as much VFX reference material as possible after every take. While that’s of course always the mission of the on set visual effects team–who also tend to work under time pressures and do not want to hold up filmmakers–here Roberts was mostly working alone, and needed to fit in with the guerrilla style of filming Edwards had chosen to follow. However, it would soon become even more apparent how ‘run-and-gun’ the production would be.

Andrew Roberts on the set of ‘The Creator’.

“Our first day of shooting was at the AI factory, where you see the robots being constructed,” outlines Roberts. “Alphie’s there with Joshua and they also meet Drew, who is looking into the mechanics of Alphie’s head. Gareth had a piece of machinery, that represented the hole that goes through Alphie’s head, and that was mounted on a stand, plus he had attached a diopter to the camera. There wasn’t a lot of space with Gareth and the focus puller and the first AD and the actors in there. I was standing by the door, and I motioned to the first AD, ‘I need to take some measurements when you’re done with that setup,’ and he gave me the thumbs up and then went back to what he was doing.”

“I’m there, taking a few reference photos from the performance. Gareth rolls for a while, then he’d have them reset while he moved around the room hunting for the perfect angle. At the same time, I’m realizing I’ll need to capture a lot more of that room. Then he says ‘Cut!’ and the actors walk away, and then Gareth and the others are walking out with the camera equipment. I’m like, ‘Whoa, whoa. I need all of that!’ but I realized I’d missed my chance. They were moving so quickly, and it was just the first day!”

Another challenge was Edwards’ preference to allow for rolling takes, i.e., not calling cut between each take. “Gareth had said to me that when you say, ‘Cut’, people will run in and they’ll start adjusting costumes and touching up make-up, and doing all sorts of things that will break the connection that the actors had with each other and with the moment,” says Roberts. “So instead he’ll quietly say, ‘Reset’, and then have the actors go again and again. Sometimes that results in–and I think this happened on day one–a take that was over 30 minutes. And not the same angle for 30 minutes, Gareth will move around and by the fifth or sixth take, we’re looking in the opposite direction. For me, it was like, ‘How do I record this information? How do I organize it?’ There’d be almost 12 unique shots within a take like that. So keeping track of the performance and scene was very important.”

The VFX surveying and reference capture that Roberts would normally attempt in the seconds and minutes immediately after a take (and he certainly did this on many set-ups during The Creator’s production) includes gray and chrome spheres, HDRIs, clean plates, and additional reference photography. It often requires coordinating with the rest of the on set crew to ensure they don’t move any set pieces or walk through frame during a capture.

Roberts quickly established that connection as the crew continued to work together, partly with an unorthodox solution. “I was using an app on the iPhone called Make it Big, where you can type a couple of bits of text and it will fill the entire frame. I would put in ‘HDRI’ and then make sure people saw that—‘Okay, we’re not going to touch it until Andrew gets what he needs…’ So after that first scene, where I didn’t get exactly everything I wanted, I was just about always able to shoot an HDRI or a clean plate. It was a wake-up call for me, that first day.”

Tools on set

To aid in capturing VFX reference imagery, Roberts enlisted his own brand of streamlined gear. One of the centerpieces was an iPad running FileMaker Pro. “It included the VES template database that let me take screenshots from QTAKE and record what I was observing in the scene to build a record of what’s been filmed,” says Roberts.

“Then I also had my 5D Mark IV Canon DSLR around my neck, which I’d use to shoot reference photos. I often had a 14mm lens on that, which was just wide enough to capture the lights and the general layout of the scene. On a separate tripod, I had a Canon 5D Mark III, with a ColorChecker Passport Photo color chart that I could bring into frame and that I had presets for, if I was going to shoot a virtual background or HDRI.”

Roberts credits the team back at ILM, including capture area tech lead/layout supervisor Dacklin Young, for helping him develop a number of HDRI presets. “These were for daytime, for dusk, and also for night, because we were going to be shooting a fair amount that was in blue hour and in dark conditions. I set a bunch of presets on the camera so that I could very quickly just create a new folder, rotate to that preset for the lighting conditions, drop my tripod in and fire it off.”
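Roberts doesn’t spell out the preset values themselves, but bracketed HDRI captures like these generally boil down to a base shutter speed per lighting condition plus a ladder of EV offsets. Here’s a minimal sketch of that idea in Python; all of the numbers are illustrative assumptions, not the actual presets used on set:

```python
# Sketch of per-condition HDRI bracketing presets.
# Base shutter speeds and EV ladders are illustrative, not the production values.
PRESETS = {
    "daytime": {"base_shutter": 1 / 250, "ev_offsets": [-4, -2, 0, 2, 4]},
    "dusk":    {"base_shutter": 1 / 60,  "ev_offsets": [-4, -2, 0, 2, 4]},
    "night":   {"base_shutter": 1 / 8,   "ev_offsets": [-2, 0, 2, 4, 6]},
}

def bracket_shutters(condition: str) -> list[float]:
    """Return the shutter speeds (in seconds) for one bracketed HDRI capture.

    Each +1 EV doubles the exposure time; each -1 EV halves it.
    """
    preset = PRESETS[condition]
    return [preset["base_shutter"] * 2 ** ev for ev in preset["ev_offsets"]]
```

With presets like these stored on the camera, switching lighting conditions is a single dial turn rather than re-deriving a bracket under time pressure, which matches the “rotate to that preset and fire it off” workflow Roberts describes.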

“I had to get really efficient at doing that quickly,” continues Roberts. “I had times that I could do a virtual background in about three minutes, and that’s where I’ve got a 50mm lens and I’m rotating 16 positions and tilting up, then tilting down and creating a super high-resolution sphere. For HDRIs, I could do that in just a few seconds from three positions.”

In fact, Roberts had done a test run before he flew out to Thailand. “I wondered, ‘How quickly can I get this down?’ I knew I might only have a few seconds to capture something. In the end, I got pretty good at flying in and then grabbing what I needed, and then getting out of the way. After that initial scare, I got into the rhythm and made sure that I was communicating with the people who would give me the information.”

Once in the swing of things, Roberts was further able to capitalize on production’s use of the iPhone app ZoeLog for recording camera lens and other data. He was able to subscribe to the feed of entries and see what data was coming out of the Sony FX3 used by DOPs Greig Fraser and Oren Soffer, and often operated by Edwards himself. “It gave me info like ISO, frame rate, and f-stop. Then it could be exported to a CSV and other formats. Sometimes Gareth was shooting with a couple of cameras. There might be a drone, as well as the camera that he was operating, so having access to ZoeLog info was a great backup if I needed to run off and capture something elsewhere or if they’d already started shooting again and I was capturing the HDRIs.”

Roberts also pays tribute to Edwards and the on set team, including the likes of DOP Oren Soffer and his gaffer Pithai Smithsuh, for allowing him to acquire as much reference as possible. “There were times when animated LED lights had been set up and they would generously turn them on and off for me to get HDRIs for each light scenario. Also, Gareth would often shoot a scene and once he was happy, he’d hand the camera to me and I would shoot tiles or mimic his move through the space, shooting a clean plate that would provide the negative space in backgrounds VFX would need.”

Another tool Roberts utilized on set was the Scaniverse iOS app. This came in handy for helping to capture LiDAR scans of the sets and props, without the usual LiDAR equipment. “We did consider shipping a LiDAR unit out to me,” says Roberts, “but I believe the model that we had access to didn’t easily allow you to pull the data off. I knew that the amount of scanning that I’d planned would quickly fill up the device during the five months I was out there. This meant the scanning happened largely with my iPhone.”

With Scaniverse, Roberts could walk around props and small locations to scan them. At the time, it tended to work best for scan durations of around five minutes or less, with the results exported to a variety of 3D formats. “It was certainly not at the resolution that a traditional production scanner would provide, but, in addition to my photos, it was a good backup to have as a representation of the 3D space.”
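One of the formats scanning apps like Scaniverse can export to is Wavefront OBJ, a plain-text mesh format, which makes a quick sanity check on a scan possible with nothing but the standard library. A minimal sketch (the tiny sample mesh is invented for illustration):

```python
def obj_stats(obj_text: str) -> tuple[int, int]:
    """Count vertices and faces in a Wavefront OBJ export, e.g. from a scan app."""
    vertices = faces = 0
    for line in obj_text.splitlines():
        if line.startswith("v "):      # geometric vertex
            vertices += 1
        elif line.startswith("f "):    # face
            faces += 1
    return vertices, faces

# A single triangle as a tiny example mesh.
sample_obj = """\
v 0 0 0
v 1 0 0
v 0 1 0
f 1 2 3
"""
```

A check like this, comparing vertex and face counts across scans of the same prop, is one cheap way to confirm a phone scan captured enough detail to serve as that “representation of the 3D space” before moving on.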

Read more in the print issue of the magazine here
