So…you need a humanoid robot in your film? What are your options?

November 10, 2021

The VFX supervisor of ‘Finch’ outlines the considerations made in bringing Tom Hanks’ robot companion to life.

Miguel Sapochnik’s Finch, currently streaming on Apple TV+, takes place in 2030 in a world made almost uninhabitable after a solar flare event. Finch (Tom Hanks) is surviving with his dog, Goodyear, a rover robot called Dewey, and a humanoid robot named Jeff.

Actor Caleb Landry Jones performed Jeff on set, with the final performance making use of a Legacy Effects animatronic and displacement suit complete with ‘Jeff’ pieces, on-set motion capture, and CG animation (the principal VFX vendors were Mill Film – now Mr. X – and Rising Sun Pictures). Hanks, too, performed Jeff motion capture in order for aspects of his character to be imbued into the robot.

Here, production visual effects supervisor Scott Stokdyk runs down the various methods considered for how Jeff would be realized, and the ultimate plan achieved during the shoot and in post-production.

b&a: I’m always fascinated by the first considerations or first discussions you as VFX supervisor have about how to do a certain kind of effect. What were the initial ways you thought about bringing Jeff to life?

Scott Stokdyk: I’m always fascinated by that, too. It’s one of the things I look forward to on a project. I’ve been a big robot film fan and have wanted to do a robot movie for a long time. From surveying the VFX landscape when we started, the best practices were to have somebody physically on set to perform with the other actors, to perhaps do on-set motion capture, and also to try to get lighting reference in camera at the same time. I’ve tried different elements of all those pieces in other movies and this one was incredibly well-suited for all of them to be combined.

The ultimate approach was that we had a displacement suit worn by the actor Caleb Landry Jones that Legacy Effects built. They had also built a real working animatronic Jeff, which was part animatronic, part puppeteered. And so they had all these manufactured pieces available to build the displacement suit.

What we did was take an Xsens MVN lycra suit, put velcro on it, and attach Legacy pieces that would help form Jeff’s silhouette and influence his physicality. If Caleb bumped into something, the displacement suit would help him interact in the proper 3D space. These pieces were also camera-ready for lighting reference, since they were derived from the real Legacy manufacturing process.

And at the same time, we also put a Jeff robot mask on Caleb. And then we had stilts on him, too, because he was supposed to be six foot six, roughly. This meant Caleb had to learn to walk on those stilts and he had to learn to perform with a mask on, which are two things that actually helped make his performance more robotic.

The other thing was, I was very interested in Jeff’s evolution from being more robotic to more human. At the start of this project, we thought he was basically going to turn into a robot Tom Hanks over the course of the story, since Jeff is essentially mimicking him during his formative period… which meant we should also have a displacement suit for Tom Hanks later in the movie!

b&a: What could you shoot with Legacy’s creations?

Scott Stokdyk: During the more robotic part of Jeff’s development, our intention was to use Legacy wherever we could. The head of Jeff at the start, that’s all Legacy. We enhanced just one of those shots with some CG eye movements. The torso coming down was also Legacy and the special effects department. We thought early on that we were going to be able to sneak in more Legacy shots, more over-the-shoulder kinds of shots at least…

We started shooting this movie mostly in chronological order. So the first scene Caleb’s in, we get to it and straight away he physically embodies the robot in such a crazy, unique way that we chose to mostly give up the idea of using the animatronic robot. We found ways to sneak real physical robot pieces in throughout the course of the movie, but it was mostly, ‘Okay, this is what Caleb did. Let’s figure out how to use this.’

At the end of the movie, we still motion captured Tom Hanks. Right at the end on the Golden Gate Bridge, actually, Tom Hanks is in a couple of those shots in the mocap suit.

But as everything evolved in editorial, as the story was being refined, we realized it was so much like a real father-son relationship – that Jeff wasn’t going towards Finch, he was going towards something different, albeit influenced by Finch. Jeff had his own persona and it was different from Tom Hanks’ Finch performance. It was inspired by it, but it was unique.

b&a: Just to go back to the early considerations: you wanted to use motion capture and the MVN suits. What did that end up giving you?

Scott Stokdyk: Doing this show we had access to Technicolor resources, and they had a lot of experience with the MVN suits on other shows. They were able to show me the ins and outs of the data from the suits and how it translated to a CG skeleton. I could see that the data was good enough to be a useful tool.
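
For context, the Xsens MVN suit is an inertial (IMU-based) capture system, and its software can export captured motion in standard interchange formats such as BVH, which pairs a joint hierarchy with per-frame rotation channels. As a rough, generic sketch of what ‘translating suit data to a CG skeleton’ involves (illustrative Python only, not the production pipeline), here is a minimal reader that prints the joint hierarchy of a BVH file:

```python
# Minimal sketch: list the joint hierarchy from a BVH mocap file.
# Illustrative only; a production pipeline would also parse the MOTION
# section and retarget the channels onto the show's rig.

def read_bvh_joints(path):
    """Return (joint_name, depth, channels) tuples from a BVH HIERARCHY."""
    joints = []
    depth = 0
    current = None
    with open(path) as f:
        for line in f:
            tokens = line.split()
            if not tokens:
                continue
            if tokens[0] in ("ROOT", "JOINT"):
                current = tokens[1]
            elif tokens[0] == "End":          # "End Site" leaf marker
                current = None
            elif tokens[0] == "{":
                depth += 1
            elif tokens[0] == "}":
                depth -= 1
            elif tokens[0] == "CHANNELS" and current:
                joints.append((current, depth, tokens[2:]))
            elif tokens[0] == "MOTION":       # frame data follows; stop here
                break
    return joints

# "capture.bvh" is a placeholder filename.
for name, depth, channels in read_bvh_joints("capture.bvh"):
    print("  " * depth + f"{name}: {' '.join(channels)}")
```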

But we didn’t have a big budget on this movie, so we were cutting and streamlining, and looking at where we could save money. And we actually saved some budget by not having the sensors in the displacement suit for the first part of shooting Jeff. We were thinking, ‘He’s going to be so robotic here. Our animators are going to be doing this all by hand. Let’s not rent the sensors. Let’s not bring our mocap guy. Let’s delay him a couple of weeks, just as part of saving some money.’

And then at some point, as Jeff was getting more human, we said, ‘Okay, from here on now, we’ve got to have this suit data.’ Then we rented the sensors and started capturing data.

Another thing to note is that there is this quad robot Dewey in the movie and he was almost fully Legacy Effects – a real robot. They say it was the most complicated robot Legacy had ever built. And it performed amazingly in sand, blowing dust, everything. When I saw what it could do, I was, like, okay, the AI brain of that Dewey robot is at the level of five or six human puppeteers’ intelligence.

So, how does AI for robotics evolve between now and 2030? Jeff has more advanced technology than Dewey. It’s as if Dewey was the test robot for Finch, and the tech was radically improved for Jeff.

Which in my view meant that the AI mind for Jeff was the puppeteers from Legacy and the 60 animators from Mr. X – all their brains combined. It was also the animation supervisors, the VFX supervisors, our director Miguel, and a lot of it was Caleb. So Jeff’s AI performance became this hybrid of many varied inputs.

b&a: I have to admit, Scott, while I was watching it, I was sitting there wondering, ‘Is that a puppet from Legacy or is it a digital Jeff?’ I honestly couldn’t tell.

Scott Stokdyk: We were trying to keep the audience guessing and we were trying to trick our colleagues, people like you. We were thinking, ‘Where can we sneak a piece of Jeff’s displacement suit into the shot?’ For the head, we thought maybe we could sometimes add glass lenses in the eye housings to enhance it. I think we maybe only got one or two shots in there with the practical face mask, but more often we were successful in getting Caleb’s real hands/gloves in the shots.

As Caleb started acting with the mask on, he became more physical immediately. He would act with his hands. Miguel and I said, ‘We need to keep those real hands.’ It often makes VFX shots harder to do. It’s easier to just paint out the actor and do something from scratch on a clean plate instead of meticulously rotomating every little movement of real gloves and adding back internal metal pieces.

But I found that it kept more reality and connection every time we kept Caleb’s on-screen gloves in the VFX shot. For almost every complex hand interaction you see in the movie, I’d say, ‘We have to keep the real gloves.’ In a way it was a type of motion capture – reality capture, really. And I think it paid off in that it did keep people guessing.

For example, there’s obviously the big scene at the end with him and Tom Hanks together, an emotional scene. There’s a lot of interaction. I insisted, ‘Okay, these actors have to really be touching. They have worked together for months and months and months. This is their moment.’

b&a: When they hug, for example, how was that shot?

Scott Stokdyk: Basically every shot was the same methodology in terms of putting Caleb and Tom together. We let them do their performance with one or two camera operators shooting. And then we walked the actors out and had the camera operators with their handheld cameras try to immediately replicate their camera moves for a clean plate. It got more complicated when the clean plate had to have Tom Hanks in there. But he’s obviously a very experienced actor and was great at it. We said, ‘Okay, you just did this emotional hug scene. Now, basically, give us another take where you go through the same motions – but it is not for performance, it is for paint work!’

The really tricky thing was when Tom was touching the robot in areas that didn’t exist in the real world. And those are always the most difficult shots. It’s just a challenge of really good secondary animation on a very, very intricate level of getting cables and mechanical pieces to conform to Tom’s hand. Sometimes we had to also change/move Tom’s hand very slightly. And then finally it was a matter of good lighting and great comp integration there.

b&a: What about when Jeff’s wearing that parka?

Scott Stokdyk: Well, in pre-production we thought, ‘Is there any way we can save some Jeff shots by having him in a real parka if we’re behind him and we don’t see any robot?’ And we may have gotten a couple of those, but mostly we had to do some additions, because you might see a little sliver of Jeff’s head or another piece of the body here or there. But in order to make that parka work, we had to have a different displacement suit that Caleb wore, that fit under the parka better and made a nicer silhouette when the parka went over it. The shoulders were slightly different, but it still had the same bumps, the back had a similar spine. And in the real world, it wouldn’t have been 100% accurate to what would have happened if the robot had put on a jacket, but it worked visually.

b&a: With all that in mind, just to go back to that first scene of Jeff learning to walk, what did you attempt there, and what did you ultimately use?

Scott Stokdyk: So, that was the first day that Caleb came out onto set in the displacement suit. We had Legacy on standby with their animatronic, the puppeteers were close by, and we had Caleb there. Caleb started performing with Tom, and it became instantly apparent that we were going to keep what Caleb was doing. We could see the roots and the seeds of an amazing performance there.

So, we very quickly said, ‘Okay, the Legacy robot is a lighting reference. We’re going to see where we can use it from here on, maybe in some driving shots.’

It turned out the Legacy robot helped us in some unique other ways… The first is that we motion captured the Legacy animatronic itself. And that was important because as much as Caleb performed in a way that had an arc from more mechanical to more human, we also knew there were things in the robot’s motion that he couldn’t physically do. So we put the Xsens sensors on the Legacy robot, calibrated him, and had the puppeteers go through a whole range-of-motion routine. And it gave us little nuances to weave in early on, mostly in the walking scene, where he resets and you get this very subtle motion of the servo motors kicking in and stopping. That was a nice touch where we benefited from Legacy’s work.
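
That servo quality, motion arriving as small discrete corrections rather than one continuous curve, can also be approximated procedurally. Purely as an illustration of the general idea (not a claim about the show’s actual tools), a smooth rotation curve can be quantized into servo-like steps with a small deadband:

```python
import numpy as np

def servo_quantize(angles, step_deg=1.5, deadband_deg=0.4):
    """Approximate servo behavior on a smooth rotation curve.

    A servo holds its position until the target drifts outside a small
    deadband, then snaps to the nearest step. The result is the subtle
    'kick in and stop' motion visible on a real animatronic.
    """
    out = np.empty_like(angles)
    held = angles[0]
    for i, target in enumerate(angles):
        if abs(target - held) > deadband_deg:
            held = round(target / step_deg) * step_deg  # snap to servo step
        out[i] = held
    return out

# Example: layer the servo look over a smooth 2-second ease at 24 fps.
t = np.linspace(0.0, 1.0, 48)
smooth = 30.0 * (3 * t**2 - 2 * t**3)  # smoothstep ease from 0 to 30 degrees
stepped = servo_quantize(smooth)
```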

b&a: And you mentioned, of course, that Jeff is ultimately mimicking Tom Hanks’ Finch. Can you talk a little bit about what you did shoot with Tom in terms of mocap?

Scott Stokdyk: There was so much that happened in pre-production. I usually try now to focus on the artistic part more and the technical part less, but for this movie I did this deep dive into machine learning and artificial intelligence.

A lot of the technology that I believe is going to be in robots in 2030 is actually being worked on in current SIGGRAPH research. It’s being done in the area of applying and modifying motion capture, and there are all these great style matching research papers. A lot of these papers address what to do if you do motion capture of someone and you want to abstract that style and apply it to some other action. I came up with this backstory that Jeff is always optically motion capturing Finch to gather his machine learning data.

And in the future, I think there is mostly going to be optical capture without markers. You can already do this with some level of success… I used some open-source software and ran through tests of getting a skeleton from a video source only, and it looked pretty decent. So we had these discussions with Caleb at the start where we said, ‘You watch what Finch is doing in this scene. And then in the next scene, you can do actions that you saw in the last scene. If you haven’t seen Finch do it, early in your development you should not do it. You can’t scratch your chin before you’ve seen Tom scratch his chin.’
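
Stokdyk doesn’t name the open-source software he tested, but skeleton-from-video is now broadly accessible. As one representative option (an assumption here, not necessarily the package he used), Google’s MediaPipe Pose can estimate a rough 3D skeleton from ordinary footage:

```python
import cv2
import mediapipe as mp

# Sketch: markerless skeleton-from-video, in the spirit of the tests
# described above. The video filename is a placeholder.
mp_pose = mp.solutions.pose

cap = cv2.VideoCapture("performance_reference.mp4")
with mp_pose.Pose(static_image_mode=False, model_complexity=1) as pose:
    frame_idx = 0
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV decodes frames as BGR.
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_world_landmarks:
            # 33 landmarks in meters, origin at the hip midpoint.
            nose = results.pose_world_landmarks.landmark[mp_pose.PoseLandmark.NOSE]
            print(frame_idx, round(nose.x, 3), round(nose.y, 3), round(nose.z, 3))
        frame_idx += 1
cap.release()
```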

That became the guiding idea for Caleb. He embraced that. He was also much more mechanical in his movements at the start of the movie, and then by the end sequence at Shiprock where they’re having the picnic, he had more of a free rein. By then the guidelines were, ‘You’re evolved now. You can do whatever you want.’ That meant Jeff had basically abstracted everything he needed, and he could transfer his own style to any move. So, a lot of the machine learning evolution was driven by Caleb’s acting choices.

The second thing we did was to have a separate motion capture session with Tom Hanks. We’d say, ‘Okay, you’re driving. How would you do this? How would you check your side view mirror? What’s your grip like?’ Our VFX Jeff was matching to a human being who happens to be Tom Hanks, who specifically is playing this character Finch. So we just had to be very, very tactical about, ‘Okay, let’s take this movement we call a Finchism. This is what Finch does in our movie.’ And we’d capture that wherever it was appropriate, and the animators were able to use that Finchism and weave it into their shot performance.
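
At its simplest, weaving a captured Finchism into a keyframed shot is a per-joint cross-fade between two animation curves over a short blend window. The sketch below shows that basic idea in numpy (illustrative only; the names are placeholders, and a real shot blend happens inside the animation package):

```python
import numpy as np

def blend_in_clip(keyframed, finchism, start, blend_frames=8):
    """Cross-fade a captured motion clip into a keyframed curve.

    keyframed:    (n_frames,) joint rotation values from the animator
    finchism:     (m_frames,) captured clip to weave in at `start`
    blend_frames: ease-in/ease-out length at each end of the clip
                  (assumes the clip is at least 2 * blend_frames long)
    """
    out = keyframed.copy()
    m = len(finchism)
    weights = np.ones(m)
    ramp = np.linspace(0.0, 1.0, blend_frames)
    weights[:blend_frames] = ramp          # ease in to the capture
    weights[-blend_frames:] = ramp[::-1]   # ease back out to the keys
    seg = out[start:start + m]
    out[start:start + m] = (1 - weights) * seg + weights * finchism
    return out

# Example: weave a 24-frame captured gesture into a 120-frame keyframed curve.
keys = np.zeros(120)
gesture = np.sin(np.linspace(0, np.pi, 24)) * 15.0  # placeholder capture
blended = blend_in_clip(keys, gesture, start=50)
```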

But at the end of the day, I feel it would be a disservice to the animators to over-promote the motion capture. A huge part of Jeff’s performance was in the millions of keyframe choices made by all the talented animators at Mr. X! We were most often faithful to Caleb’s head and hand position, but everything in between had a lot of thought and artistry added to it.

Lindsay MacGowan from Legacy Effects with ‘Dewey.’

b&a: The Dewey robot, you mentioned Legacy made that. Were there paint-outs required or other CG replacements?

Scott Stokdyk: Almost nothing. The crazy thing was, Legacy only had the budget to build one robot! So, this robot is priceless, right? He comes with his own crew, and the special effects team is blowing sand and dust and dirt, and lifting Dewey up and dropping him down. And so everyone’s on pins and needles. And anytime, say, one of his legs slips by accident, the puppeteers have to run over and check him and it’s a big deal. But he came through in an amazing way, and looks great on-screen!

There were very few times we had to do something for Dewey in CG. For one claw grab that we wanted to be a little bit more elegant, we replaced a piece of him… and also when Dewey gets damaged, we couldn’t afford to hurt the real one. There was also one shot that we came up with in post and editorial long after we wrapped. And that’s when Finch throws the ball for the dog and the robot chases out past him – that was just, ‘Oh, wouldn’t it be cool? Well, we should have gotten this. Well, we have a CG one… let’s make this a sweet moment.’ And we did.

