‘Elvis’ featured a ton of VFX you may not have noticed

Visual effects supervisor Tom Wood breaks down the big VFX challenges on the Baz Luhrmann musical.

Baz Luhrmann’s Elvis, starring Austin Butler, is a film full of invisible effects: there are CG landscapes, digital crowds, environment extensions and even some seamless machine learning-based face replacement shots.

Overseeing this array of VFX work was visual effects supervisor Tom Wood, who shares with befores & afters the different aspects of re-creating key musical and family moments in Elvis’ history.

b&a: As a VFX supe, it would be really cool to talk about breaking down the script and all the planning that goes into it. Can we go back to the beginning?

Tom Wood: I got a call from Warner Bros. and they hooked me up on a Skype call with Baz. Then I was flown to New York to go and meet him. We started work in August 2019. In New York, we had a week there going through the script and that script was actually never shot. It changed completely by the time we got to it.

b&a: Clearly it’s a film with a diverse range of visual effects: set extensions, a lot of CG work, different kinds of things. Going into it, what were the kinds of conversations you had with Baz and the production team?

Tom Wood: Crowds was one of the big challenges. I committed to 3D crowds, which is pretty standard now. I downloaded and learned Blender to do some temp work, and a few things got into the film as well.

By doing that, I pushed back the decision making process for Baz. It meant he didn’t have to decide on any particular actions during shooting. He’s obviously very actor-driven. He’s on set, he’s really concentrating on the performance and the absorption for the audience. That was good for me to see that and to see how well that worked as well, in terms of how emotions drive the scene.

By committing to those 3D crowds, it meant that I didn’t really care what he did with the camera. He could go anywhere, and I could cope with that. I was always saying we can’t go too close to the crowd members. And we never did.

b&a: Because there are so many different types of sequences and types of VFX, was it the kind of show that lent itself to any previs or techvis?

Tom Wood: Yes, Baz works with Chris Tangney. He’s a savant-ish visual effects guy who runs Unreal Engine simulations for Baz. He’d already built a Beale Street, a Vegas showroom, parts of Graceland. He also works with production designer Catherine Martin. Baz could then fly around Beale Street day and night and set up camera and lighting positions. DOP Mandy Walker had Chris also do lighting simulations for where the set was going to be built.

The most previs’d sequence was the ’68 special, the comeback special. Baz wanted it to go into a fantasy world and the camera would spiral outside the set, looking in, on Elvis performing in a crowd. Chris would do these crazy, crazy shots and from that we would do these techvis simulations.

Also, there was a big camera move craning in from Beale Street, across the street and in through a window of Club Handy to find B.B. King and Elvis after the bar is closed and they’re all chilling and singing and playing together. We did multiple techvis passes of what we could do with the Technocrane, where the camera could start and where it could finish.

b&a: What about when you needed to go to fully digital environments, such as, I’m assuming, many of the Vegas ones?

Tom Wood: Vegas itself looks nothing like it did in 1968 or the early ’70s. So we had to go back to original pieces of footage for reference. It was all about the big neon signs, the big marquee signs. Rising Sun Pictures took exterior Vegas and the hotel. Mr. X (now MPC) had the interior of the hotel, the showroom, with all the crowds. Method Studios (now Framestore) also did crowds for Russwood Park and Beale Street. Then Luma Pictures did work at Graceland and the Hollywood sign. We actually had 14 vendors!

b&a: I didn’t want to ask you about all the different shots and vendors, but I saw that Rising Sun Pictures had used some interesting machine learning techniques for a few face replacement shots, where they incorporated Austin Butler into some of the archive material.

Tom Wood: We actually started that in 2019 with them. They worked with the Australian Institute for Machine Learning, and they ran a test within a week, finding source footage of Austin Butler and source footage of Elvis and transposing them. That was amazing to see. The footage they found of Elvis wasn’t at 24 frames, it was like a sped up version of Elvis, and about 10 or 12 cuts together. After the machine learning process, they all worked, bar a couple of them, including shadows being cast across Austin’s face from the scene. It all worked really well.

b&a: I think those kinds of shots reflect the diverse nature of work you had to do in this film.

Tom Wood: Yeah, it’s a fun aspect of Elvis and a fun aspect of Baz, as well. One interesting part of the work included shots that Luma did. There’s a drive to Graceland, and there are two shots of the pink Cadillac approaching the house with Elvis and his mother and father and grandmother in the car, through the windscreen, which was shot for a completely different scene. We said, ‘We can make it work, but we have to replace the car. We have to replace everything.’

Then there’s the Lisa Marie jet on the tarmac scene, by Fin. That was filmed on a stage on the Gold Coast, which is one of the smallest stages. Baz said, ‘I want it to be huge. It’s like this spy exchange in the middle of nowhere.’ We worked hard trying to make the lighting work for us, so we went with a gray flat light like it had been just after some rain.

b&a: I really like hearing about that because the whole film sounds like a big problem-solving exercise, and you came up with some pretty neat solutions.

Tom Wood: I mean, the overriding thing I’d say is that those solutions need to match the emotional content of the scene, as well. That’s the key, finding something that works emotionally.