The tools Imageworks developed to help give ‘Mitchells’ its painterly style (and to create giant Furbies).
Just as it did for Spider-Man: Into the Spider-Verse, Sony Pictures Imageworks was called upon for The Mitchells vs. the Machines to craft something different from the usual 3D animated film style.
That meant the VFX and animation studio had to craft several new tools to allow for particular illustrative qualities, as well as develop workflows to represent characters, props and environments in a more painterly manner.
Visual effects supervisor Michael Lasker lays it out for befores & afters. He also discusses: Furbies.
b&a: This film really does have that hand-painted watercolor style, as well as some very realistic work. What did that style actually mean for Imageworks?

Michael Lasker.
Michael Lasker: They really went after this. They wanted to feel the hand of the artist. It was really the directive from the Sony Animation team. And on Spider-Verse we had a definitive comic book target, though none of that was easy. But with this one, the artistry was something we had to kind of find to some degree with them. They gave us a lot of hand-painted reference. They had a lot of concept art. We worked with them until we all landed on what the final look of the picture would be. But we had to invent a bunch of tools to really make the look happen because, as we realized on Spider-Verse right off the bat and on this one even more so, turning a painting into moving images reveals a lot of things.
We really had to first break down all the components that would make up the look, because if you look at a painting, you’re like, ‘Oh, I see lines. I see brushstrokes. What’s going on with this spec on the skin? What’s the hair doing?’ And we’ve come up with this method where we look at it, we break it down and we figure out how to achieve each one and really match the painting and then make it move. And the more we’ve done it, the better we’ve gotten at figuring out all the kinks that are going to happen when things animate, making sure it’s stable. But here I was just trying to be like, ‘Forget how, let’s just do everything we can to match this thing. I don’t care how broken or hard it is or messy it is to get there.’
Sony Animation’s team actually gave us shader balls, but in a painterly style. They painted us six yellow spheres to show: how does the light hit it? How is the shadow broken up by brushstrokes? What does the line work do in the light, in the shadow? That was fantastic, because we had to rebuild the lighting model to work with this, so it was really great that we got that. We had to build a tool that layered on brushstrokes. As the character moves, the light would dynamically move through them; you don’t want it to look like a texture-mapped dot. That’s always the kiss of death with this stuff: with lines, with brushstrokes, with anything you’re adding, you just don’t want it to look mapped.
b&a: What were some of the other tools that you had to come up with?
Michael Lasker: We had a lot of outlines, first of all. And we used some of the facial tools that we used in Spider-Verse, the same tool for expression lines that are more clunky. But we had to build an entirely new outline tool for characters and environments that had more of a watercolory look. It needed variation, it needed brushstroke breakup, and it also needed to respond to light.
What was really cool is we were able to get a light side and a shadow side, so however you shined the light, the lines respected it, but they would also inherit the color of the surface underneath. And we’d always have the line be a little darker than the surface it’s on. Like on Rick’s jacket, which is yellow, we get an outline that’s a little bit more orange, even though it’s in the light.
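To make that outline behaviour a little more concrete, here is a minimal sketch of a line that inherits the underlying surface colour and is always pushed a bit darker (and, on warm surfaces, a bit more orange). It is not Imageworks’ shader; the function name, darkening amount and hue shift are assumptions purely for illustration.

```python
# Hypothetical sketch only: an outline colour derived from the surface beneath it,
# always a little darker, and nudged toward orange on warm (yellowish) surfaces.
def outline_color(surface_rgb, darken=0.35, hue_push=0.1):
    r, g, b = surface_rgb
    # darken relative to the surface so the line never disappears into it
    r, g, b = r * (1.0 - darken), g * (1.0 - darken), b * (1.0 - darken)
    # pull green down slightly on warm surfaces, shifting yellow toward orange
    g = max(0.0, g - hue_push * r)
    return (r, g, b)

# e.g. a yellow jacket surface in light yields a darker, more orange line
print(outline_color((0.9, 0.8, 0.2)))
```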
We built this brushstroke tool that would dynamically layer on different strokes as shadows became more dense. We could rotate the light. We were rotating the light around Rick and you’d see the shadows building up and then falling away, and they’d dynamically build up all these strokes. And Sony Animation could actually give us the brushstrokes they wanted to use and we could feed those into the tool, which was great.
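The stroke-layering behaviour Lasker describes can be pictured as a simple lookup: the denser the shadow, the more of the artist-supplied strokes get stacked on. The sketch below is only an illustration of that idea, with invented stroke names and thresholds, not the studio’s tool.

```python
def strokes_for_shadow(shadow_density, stroke_library):
    """shadow_density: 0.0 = fully lit, 1.0 = fully in shadow.
    Denser shadow stacks up more of the artist-supplied stroke layers."""
    count = round(shadow_density * len(stroke_library))
    return stroke_library[:count]

# hypothetical artist-supplied strokes, ordered from lightest to heaviest
library = ["light_hatch", "mid_wash", "dense_crosshatch", "core_shadow_block"]
for density in (0.1, 0.4, 0.7, 0.95):
    print(density, strokes_for_shadow(density, library))
```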
And then on top of that, we did something similar with depth of field, where we would layer up brushstrokes to break things up in depth, because you never wanted anything to be that sharp, or just look too CG. We didn’t want that CG curve, whether it was lens flares or depth of field, or just our textures in general. That was another thing: we still painted textures, but we painted them in such a way that they would marry with the brushstrokes we were adding on top. It took us a little bit of time to figure out what that style was, where you could have both existing together without stomping on each other.
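The depth-of-field breakup follows the same logic, only driven by distance from the focal plane rather than shadow density. Again, a hedged sketch with assumed names and falloff values rather than the production setup:

```python
def dof_stroke_layers(depth, focus_distance, max_layers=4, falloff=5.0):
    """More breakup-stroke layers the further a sample sits from the focal plane,
    instead of a conventional smooth optical blur."""
    distance = abs(depth - focus_distance)
    return min(max_layers, int(distance / falloff * max_layers))

for depth in (10.0, 12.0, 16.0, 30.0):
    print(depth, dof_stroke_layers(depth, focus_distance=10.0))
```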
We also modelled everything a certain way; here it was meant to be irregular. You’d have a little bit of wonkiness, just irregularities that would play in with the style. If you have an outline on a perfect edge, it’s going to be really hard to make that outline look artistic. But if you have a really nice irregular model out of the gate, that just helps you off the bat.
What was really interesting about everything we had to do is that we had to sort of scale it down as the movie went on. We moved from the home of the Mitchells, where, if you look at that kitchen, it’s irregular, nothing straight, it looks lived in. As you go toward the latter half of the movie and into this robot world, everything has to get cleaner.
When I first started the film, they were like, ‘Okay, we need the human world and the robot world, and the characters have to exist in both.’ If you watch act three where they’re running around in PAL Labs, or even in the mall, we’ve scaled it down to cleaner lines. We wanted the hand of the artist, but we also wanted to evolve it over the course of the movie so you didn’t really notice it happening, but you could tell things were just cleaner in the more robotic environment at the end.
b&a: I remember on Spider-Verse there were some machine learning techniques in Imageworks’ pipeline for the face lines or for outlines. Was that something continued here?
Michael Lasker: What we did on Spider-Verse with machine learning was for the outlines: we had so many different outlines going on, and anytime we wanted an outline on the bridge of the nose or on the chin, or just anywhere where, as you move, you really need the line to dynamically change, because if the line stays on the bridge of the nose it’s going to look stuck on; you want it to sort of naturally grab the profile. We didn’t end up having to use that on this, because we just didn’t rely on those types of lines as much. It was less ink-liney. It was more just watercolor-based lines. We kept our expression lines with the animation tool, but we didn’t really need to use the machine learning lines for them.
b&a: I really loved, too, that some normal things you might do in CG in a typical animated film, like a CG tree, now had to feel more painterly. Or there’s Rick’s fur on his jacket. But did you still have to run through the CG process to get there?
Michael Lasker: No matter how stylized these projects get, you still run through the same processes, you just do them slightly differently. For those trees, fortunately, our first idea out of the gate worked. We had developed a tool on Spider-Verse for Gwen’s world (you didn’t see that much of it, but it was her very artistic, brush-stroked world). So when I rolled onto this show, I was like, ‘That tool would be great for all the vegetation, but we need to just keep working on it.’ Because grass and trees take up so much of the frame in a project like this, we built the trees with leaves, but we also built interior solid shapes, like core shapes.
You’d have the shapes and the inner core that eliminated any negative space. And then you’d apply the simplification tool to it. It would really merge it all together into just color, just blotches of paint. And we did the same thing on the grass. So the grass was all groomed, like real grass. And then we just simplified it all together because you need that underlying detail to be there, to make the simplification work. It’s funny, you need to be detailed in order to simplify it, because if you don’t have any detail in there, when you simplify there won’t be anything there.
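One way to picture that detail-then-simplify step: start from thousands of individually placed leaves or grass blades, then merge neighbours into a handful of averaged paint blotches. The grid-based clustering below is an assumption for illustration, not Imageworks’ simplification tool.

```python
import numpy as np

def simplify_to_blotches(points, colors, cell_size=1.0):
    """Bucket detailed geometry into coarse cells; each occupied cell becomes one
    blotch whose position and colour are the average of everything inside it."""
    cells = {}
    for p, c in zip(points, colors):
        key = tuple(np.floor(p / cell_size).astype(int))
        cells.setdefault(key, []).append((p, c))
    blotches = []
    for members in cells.values():
        pts = np.array([m[0] for m in members])
        cols = np.array([m[1] for m in members])
        blotches.append((pts.mean(axis=0), cols.mean(axis=0)))
    return blotches

# 10,000 "leaves" collapse into far fewer blotches of averaged colour
rng = np.random.default_rng(0)
leaves = rng.uniform(0.0, 10.0, size=(10000, 3))
greens = rng.uniform([0.1, 0.4, 0.1], [0.3, 0.8, 0.3], size=(10000, 3))
print(len(simplify_to_blotches(leaves, greens, cell_size=2.0)))
```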
The funny thing about the trees is they had these little squiggly shapes that looked like hands. We called them hands. What we did is we took a bunch of planes we called post-its, stuck them in the trees and then constrained them to the camera. So wherever the camera would look, they’d always be flat-on and you’d just see those hand shapes, and that kind of added to the illustrativeness.
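The ‘post-it’ planes are essentially camera-facing billboards: flat cards whose orientation is constrained so they always read flat-on to the lens. A minimal, DCC-agnostic sketch of that constraint, assuming a simple look-at construction:

```python
import numpy as np

def camera_facing_rotation(card_pos, camera_pos, up=np.array([0.0, 1.0, 0.0])):
    """Build a 3x3 rotation whose local z-axis points from the card back at the
    camera, so a painted 'hand' shape on the card is always seen flat-on."""
    forward = camera_pos - card_pos
    forward = forward / np.linalg.norm(forward)
    right = np.cross(up, forward)
    right = right / np.linalg.norm(right)
    true_up = np.cross(forward, right)
    return np.column_stack([right, true_up, forward])

# orient one card in the canopy toward an arbitrary camera position
print(camera_facing_rotation(np.array([2.0, 5.0, -1.0]), np.array([0.0, 1.7, 8.0])))
```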
But with Rick’s collar, it was actually a combination of hair and geometry with squiggled opacity map textures on it. First we went all fur, and it did not work. Then we went all geometry and that was all right, but it was a hybrid mix of the two that worked in the end.
b&a: I even felt like the exhaust smoke trails from the robots when they’re flying just felt like a different way of approaching that kind of thing. Were those exhausts done slightly differently?
Michael Lasker: Oh yeah. Pav Grochola, who was the FX supervisor on Spider-Verse, was on this, and he’s so good at stylized effects. For the robot thrusters, we did a combination of four or five different FX piled on top of each other. We hired some 2D FX artists who would actually animate, when the robots take off, these flashes of very shard-y shapes. They would animate those in 2D to the motion. And then on the simulation side, Pav and his team simulated these very polygonal vapour trails. We’d have these great tests of just watching a robot flying, just seeing these polygonal vapour trails. They came up with this really articulated way that looked really cool. It was semi-transparent, kind of with a blue hue.
And then on top of that, we had additional polygonal vapour trails coming off the arms, kind of like the vapour trails you get off a plane’s wings. And then on top of that, we had these inner core thrusters that had these diamond shapes: an outer outline and an internal diamond shape, and that was kind of like the heat in the thruster. So you’ve got the internal graphic heat, you’ve got the polygonal vapour, you’ve got the vapour trails, and then you have the 2D sparks when they’d take off. You see that when the robots first turn bad and fly and destroy the auditorium; there’s a lot of that takeoff effect in that sequence.
b&a: Then in that final aerial battle there’s just a color explosion everywhere. That’s obviously lots of FX sims, but is it also 2D-drawn stuff as well?
Michael Lasker: What we would do is we would get the simulation first. We used artistic sprites on the sim, so all of our smoke would still have some brushstrokes mixed in. And once we approved the sim, the 2D FX artists would then paint over the top of that. So we would get the motion down, working with the scene and the action, and then the 2D artists would enhance it on their side, and then we would take their elements and combine them in comp.
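Combining those painted 2D elements with the rendered sim ‘in comp’ comes down to standard premultiplied-alpha ‘over’ operations. The sketch below shows that operation in isolation; resolutions and element names are placeholders, not the production comp setup.

```python
import numpy as np

def over(fg_rgba, bg_rgba):
    """Composite a premultiplied foreground element over a background."""
    fg_rgb, fg_a = fg_rgba[..., :3], fg_rgba[..., 3:4]
    bg_rgb, bg_a = bg_rgba[..., :3], bg_rgba[..., 3:4]
    out_rgb = fg_rgb + bg_rgb * (1.0 - fg_a)
    out_a = fg_a + bg_a * (1.0 - fg_a)
    return np.concatenate([out_rgb, out_a], axis=-1)

# e.g. the painted 2D enhancement layered over the approved FX sim render
sim_render = np.zeros((270, 480, 4), dtype=np.float32)
painted_element = np.zeros((270, 480, 4), dtype=np.float32)
print(over(painted_element, sim_render).shape)
```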
The colors are just off the charts in that. There was a color palette that we wanted to hit, so in a lot of those shots we’d start off with one color arrangement that wouldn’t work, and then we’d sort of switch it up because some colors would grab your eye too much. They wanted to stay in magentas and blues and pinks a lot of the time. And then you’d have the sun setting when Linda comes out with her bandana on and an arrow on her back, so you get all this gold colour, but then you need the explosions to sort of marry with that, which was kind of cool.
b&a: I think it’s very important that I ask you about Furbies. Just from a visual effects point of view, what were the challenges of the Furbies?
Michael Lasker: Okay, well, the challenges were, we needed to make them little, but then we also needed to make a massive Furby, right? We did two grooms. We groomed the regular Furby size and basically matched them, in an illustrative way, to how they really are. But then with the huge one, we really had to make it seem like that fur had weight, in the way he moved and the way the fur would move, because this guy is like 20 feet tall, 30 feet tall, at least.
I think it was just making the big guy look big, selling that, but not overdoing it, because you didn’t want the fur moving to be distracting. But we had a lot of different texture areas on these Furbies. We had different spot colors and values. And Lindsey actually had one that we used as our reference, so that was super valuable. But the close-up on the eyes, when it first woke up, we spent a lot of time on that shot. Just how the shadows cast, how hard-edged and realistic we wanted it to look versus illustrated, because we wanted it to just say Furby. A lot of fun, a little creepy, but a lot of fun.