How VFX took Mark Ruffalo to a whole new level of Hulk in ‘Endgame’.
We’ve seen Hulk in a lot of Marvel films, but never like this. Avengers: Endgame features a fully computer-generated Bruce Banner (Mark Ruffalo) in a form where Banner has merged his intelligence with the Hulk’s, well, Hulkish-ness. That required a whole new approach from visual effects, which had to capture Ruffalo and then bring to life a performance that sat closer to the human actor than anything done previously.
Tasked with the majority of ‘Smart Hulk shots’ was ILM, which also built the asset. Framestore also contributed Hulk shots to the film. Both studios worked under production visual effects supervisor Dan DeLeeuw. Here, along with DeLeeuw, ILM and Framestore talk to befores & afters about starting with a Smart Hulk test, how refined on-set capture methods and facial capture – including the new ‘Anyma’ system – were utilized this time around, how Ruffalo performed on set, and what else made the CG Hulk a fresh challenge.
The Smart Hulk test
To prove that this new-look Hulk could work in scenes in the film, ILM produced a test. It began with a new Hulk sculpt drawing off of Marvel artwork, and incorporating clear Mark Ruffalo features. Endgame producer Kevin Feige pulled footage from an online interview Ruffalo gave after the release of Spotlight, a film the actor appears in. “That’s a pretty heavy movie and it was sort of matter-of-fact, it was Mark talking about that movie,” notes ILM visual effects supervisor Russell Earl.
“I think that seeing Mark just speaking in his normal very sincere, earnest tone was great,” adds ILM animation supervisor Kevin Martel. “We wanted to see what that looked like coming out of what had previously been a more ‘grunty’ creature that didn’t really speak.”
“So we did that test and then we sent it down to the client and they saw it and I think that really gave them the confidence to go forward,” says Earl. “They showed it to Mark and what it showed, I think, is that he didn’t have to try to accentuate or over-perform. He knew that we could get at the character without him having to act more ‘Hulk-ish’.”
On location, Ruffalo generally wore a motion capture suit and a helmet-mounted camera system. The motion capture was handled by Profile Studios and the HMC was from Fox VFX Lab (formerly Technoprops). “The first priority was letting Mark be in a mocap outfit acting against the other actors,” outlines Dan DeLeeuw. “You get a much better performance because everybody’s together.”
“When you have someone like Mark Ruffalo,” notes DeLeeuw, “they will bring something to the character that you won’t always find. And in the interactivity between other actors, you’re going to find something you wouldn’t know if you’d just used stand-ins.”
Things did move between recording full motion capture data and more ‘faux capture’, depending on whether the shoot was in a contained studio environment or outdoors. “When you’re outside, generally it becomes more of a faux-cap thing. There’s a couple of instances where we took cameras out, but in Atlanta, the weather changes on a dime,” relates DeLeeuw. “So then your towers are swinging in the wind, and your day is not that great.”
For his facial performance, Ruffalo regularly wore tracking dots on his face as part of each scene. The actor had gone through various scans, including a Disney Research Zurich ‘Medusa’ scan, which allowed the studios to build a per-frame mesh on which to base their model and animation. The idea was to translate Ruffalo’s performance onto the CG Hulk as convincingly as possible.
ILM, in particular, re-built its facial animation system as part of the Smart Hulk work. “The first part of that was making sure that, with the solves we were getting, we could solve Mark Ruffalo’s performance and put it onto our Banner asset,” explains Earl. “It was about getting every little nuance in his face. To get that, we basically started with the solver that we had and we were working on improving that solver, and then by the end of the show we were able to use Disney Research’s new solver, Anyma, which had up until this point been used more in an ADR-type booth, where you had three cameras and a performer would be ‘re-doing’ a performance that could then be captured.”
“We talked to those guys,” continues Earl, “about trying to take the head mounted camera footage that we had, running that through the Anyma solver which doesn’t just rely on the low-res mesh generated off the points – it’s generating a mesh per frame and it’s doing a photometric solve based on the footage from those head mounted cameras.”
“We’d ingest the plates from the head mounted cameras, do our rigid stabilizations, do our Anyma solve and get a good result. We could look at the plate of Ruffalo and the renders of our Banner and say, ‘Yep, yep that’s it. That one looks good.’ Once we had that, the next step of the pipeline – which we also completely rebuilt – was the re-targeting aspect: taking our Ruffalo solve and then retargeting that to Hulk.”
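ILM’s actual solver and retarget tools are proprietary, but the solve-then-retarget idea Earl describes – fit the actor’s captured face with a weighted combination of expression shapes, then re-apply those weights on the creature’s own shapes – can be sketched in a simplified blendshape form. Everything below (function names, the least-squares solve, the per-shape gains used to “amplify or suppress” expressions) is an illustrative assumption, not ILM’s pipeline.

```python
import numpy as np

def solve_weights(neutral, shapes, frame):
    """Least-squares fit: find blendshape weights w so that
    neutral + sum_i w_i * (shape_i - neutral) approximates a captured frame.
    All meshes are flattened vertex arrays of length 3V."""
    deltas = np.stack([s - neutral for s in shapes], axis=1)  # (3V, N)
    w, *_ = np.linalg.lstsq(deltas, frame - neutral, rcond=None)
    return np.clip(w, 0.0, 1.0)  # keep weights in a sane activation range

def retarget(weights, creature_neutral, creature_shapes, gains=None):
    """Re-apply the solved actor weights on the creature's corresponding
    shape set, with optional per-shape gains to push a read toward or away
    from the actor's expression."""
    deltas = np.stack([s - creature_neutral for s in creature_shapes], axis=1)
    if gains is None:
        gains = np.ones(len(creature_shapes))
    return creature_neutral + deltas @ (np.asarray(weights) * gains)
```

In this toy setup the “retarget” is just weight transfer: because the creature’s shapes are sculpted to correspond one-to-one with the actor’s, the same weight vector drives both rigs, and animators can then dial individual gains per expression.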
Animating a smarter Hulk
Ultimately, both ILM and Framestore would work on final Hulk shots, with Framestore also delivering test footage as the CG character was refined. From ILM’s point of view, having all that capture, reference, and detailed modeling, rigging, and texturing of Hulk gave them the ability to have full control of the performance in animation. “We could adjust or manipulate any bit of his facial performance,” says Martel. “What that meant was, we had a very solid match to what Ruffalo did – and we would continuously look back at the Ruffalo footage – but when you get it onto Hulk there may be certain things that need to be amplified or suppressed depending on how the performance reads on him because he is different.”
Framestore followed a similar methodology, and its animation set-up also introduced some new machine learning approaches. “We did this for animation tests before we were awarded the work,” says Framestore visual effects supervisor Stuart Penn. “That involved taking some footage from the head-mounted cameras, doing a pass of some key shots and some key animation, and choosing some key frames which matched Mark’s performance. We fed those key frames into the machine learning system, which would solve an entire shot and give you a first-pass animation that we could use very quickly to get a version, say, for the temps or for the first edit. Then beyond that we worked into it using hand animation to get the more finessed performance, and right at the end we also extracted some micro-movement from the machine learning to add as a final pass over the hand-animated performance.”
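The workflow Penn describes – train on a handful of hand-matched keyframes, then let the system solve the whole shot as a first pass – can be illustrated with the simplest possible learned mapping: a ridge regression from tracked marker positions to rig control values. This is a deliberately minimal sketch under assumed data shapes; Framestore’s actual system and its inputs are not public, and all names here are hypothetical.

```python
import numpy as np

def fit_solver(marker_keyframes, rig_keyframes, reg=1e-6):
    """Fit a ridge-regression mapping from flattened HMC marker coordinates
    (K frames x M values) to artist-set rig control values (K x C),
    trained only on the hand-matched keyframes."""
    X = np.asarray(marker_keyframes, dtype=float)
    Y = np.asarray(rig_keyframes, dtype=float)
    A = X.T @ X + reg * np.eye(X.shape[1])  # regularize for stability
    return np.linalg.solve(A, X.T @ Y)      # weight matrix (M x C)

def solve_shot(all_marker_frames, W):
    """Apply the learned mapping to every frame of a shot,
    producing a quick first-pass animation curve per control."""
    return np.asarray(all_marker_frames, dtype=float) @ W
```

A real system would be nonlinear and far richer, but the division of labor is the same: a few carefully matched frames define the mapping, the solver fills in every other frame, and animators refine on top by hand.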
There were certainly moments when the more human-like Hulk presented additional challenges for the VFX artists on the show. “There were many times,” shares Martel, “when we would look at Ruffalo in the plate and you accept it as Mark Ruffalo because you know that it’s him, but when he makes certain expressions, if you just look at his face in isolation, sometimes he doesn’t look like himself. So when that gets transferred over to Hulk we struggled with that same thing: you would accept it on a live-action person because you know who it is, but for our character it can take him a little off-model. So we had to play with the dials to make sure we got enough of Hulk, enough of the creature still in there, and the likeness. Balancing that likeness of Ruffalo and creature is something that we were always very mindful of.”
Smart Hulk, yes, but also funny Hulk
Smart Hulk’s appearance in Endgame also facilitates several comedic moments that had not previously been possible.
“Because it’s a comedy performance, that made it trickier,” declares Framestore’s Penn. “Hulk had such a range of expressions – he’s telling jokes, he’s switching from being very serious to being lighthearted, so getting that comedy performance involved some very subtle timing. It’s a balance between how to keep Mark’s performance whilst translating that into something that still performed like Hulk.”
ILM had the comedic shot, for instance, of Hulk walking out to the past-New York street scene to smash things (which he does only tentatively). “You can see in that moment that he still truly is Banner,” notes Martel. “It’s like if it was Bruce Banner walking out into that New York street to smash things. This is how he would probably do it because it’s hard for him to even pretend to be such a monstrous character. It was a really fun moment for us to explore, for sure.”
Explore more of our in-depth Avengers: Endgame coverage during #endgameweek.