How the stop-motion studio is using A.I. to deal with face seams and much more.
If you’ve never seen how Laika carries out its stop-motion character animation on films such as Kubo and the Two Strings or Missing Link, it’s a sight to behold. Essentially it involves replacement animation, where a puppet’s face or head is swapped out each frame for one with a slightly different expression. Played back, this makes the character appear to talk, smile and emote.
Replacement animation is, of course, not uncommon in stop-motion animation, but Laika has taken the art to new levels thanks to 3D printing and its incorporation of VFX and CG into the process. Oftentimes, only part of the printed puppet face is replaced each frame (since only the mouth may need to move). But that means ‘seams’ or ‘split lines’—the connection between two face parts—are introduced, and need to be painted out for the final shot.
Enter: Laika’s visual effects team, and specifically, the roto/paint crew. They have become a key part of removing those seams in the studio’s films, starting with Coraline in 2009, using rotoscoping and digital painting methods.
In addition, these VFX artists deal with elements such as rigs used in the animation process (both holding and even attached to the puppets), cleaning up chatter between frames, color correction, and even eyelid enhancements. All that work has traditionally been done with standard rotoscoping, paint and wire removal tools. And just like any roto or paint work done on live-action films, it can be a laborious process.
It’s something Laika wanted to change…
Changing the approach
“The thing about rotoscoping is that it’s very, very important,” Laika visual effects supervisor Steve Emerson told befores & afters. “But at the same time, it is also a task that folks aren’t always crazy about doing. What we really wanted to do was find ways to do things that are more artistic and free up time for the artists.”
Having seen developments in machine learning and A.I. in visual effects in recent years, Laika was keen to see if the repeatable task of rotoscoping could be somewhat automated, especially for the seam that runs across the eye-line and around the eyelids of the studio’s puppets owing to the replacement animation approach. Could a tool be developed that helped with both tracking the key features of a puppet’s face, and then helping to define the roto shapes that aid in all that necessary clean-up work?
Not only that, Laika knew that it had a lot of data on-hand in terms of how it currently approached rotoscoping. Perhaps this data could, in the spirit of current machine learning techniques, be used as training data for any such tool?
It turned out that Intel’s Applied Machine Learning team, which happens to be based very close to Laika in Portland, Oregon, was looking for a visual effects problem like this to solve with machine learning techniques. Intel’s team, led by Narayan Sundararajan, was given a breakdown of how Laika approached the rotoscoping task. They then set about devising an A.I. tool that could bring ‘automation’ to the roto and paint process.
Laika, itself, was tasked with providing Intel with the right training data for the proposed tool, based on the studio’s countless hours of roto work. “One of the things we learned in that process is that there is a tremendous amount of work that’s not perfect,” acknowledges Laika production technology director Jeff Stringer. “It’s just what you have to do for that particular frame. So we spent a lot of time just refining those data sets down to the essentials, figuring out what the landmark tracking points would be. Even agreeing on the terminology for these things took a long time.”
“We even went down a rabbit hole of trying to photograph the puppet heads,” continues Stringer. “We put the heads on a stick, put them in front of a robotic camera and took a bunch of different shots, kind of like a Light Stage, to see if we could use that data. But it turned out that the data we really needed was very task-specific. It almost had to come from roto. And the more we zeroed in on that, we learned that we could build all the data we needed from a low number of shots. It turned out that about five or six shots was enough.”
What this training data provided were ‘ground truth’ roto nodes, which defined what the ideal shape would be for doing certain fixes to the puppet animation. In particular, significant work went into how those shapes could be created from as few tracking points as possible.
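To make the ‘ground truth’ idea concrete, here is a minimal sketch of what such training pairs might look like: each sample couples a rendered frame with the artist-made landmark points and the approved roto shape. All field and function names here are invented for illustration; the article does not describe Laika’s or Intel’s actual data format.

```python
from dataclasses import dataclass

@dataclass
class RotoSample:
    """One training example pairing a frame with artist-made ground truth.
    All fields are hypothetical, for illustration only."""
    frame_path: str   # rendered plate for one frame
    landmarks: list   # (x, y) tracking points on the puppet face
    shape: list       # control points of the approved roto shape

def build_dataset(shots):
    """Flatten a handful of shots (about five or six were enough,
    per Stringer) into per-frame training samples."""
    samples = []
    for shot in shots:
        for frame in shot["frames"]:
            samples.append(RotoSample(frame["path"],
                                      frame["landmarks"],
                                      frame["shape"]))
    return samples
```

The key point from the article survives even in this toy form: the useful data was task-specific, pairing frames directly with the roto an artist would have produced.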
“Intel got really good at finding the tracking points,” says Stringer. “We were like, ‘That’s good enough! Just give us those, we’ll take it from here.’ But they were like, ‘No, we can figure out how to generate these shapes.’ And they did.”
Ultimately, what was produced became known as the ‘Intel prototype tool’, which works within Foundry’s Nuke. Based on the training data, it provides artists with tracking nodes for key points on the puppet faces, then uses those points to infer a series of overlapping mattes or roto shapes that are spatially and temporally smooth across the shot. Every shape and tracking node remains editable in Laika’s standard tools, such as Mocha in Silhouette.
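The two-stage idea described above—track landmark points, then infer smooth shapes from them—can be sketched in simplified form. This is not the prototype tool’s actual method (the article doesn’t detail it); it is a toy illustration, assuming per-frame landmark tracks as input, with temporal smoothing via a moving average and a naive centroid-offset expansion standing in for the learned shape inference.

```python
def smooth_tracks(tracks, window=3):
    """Temporally smooth per-frame landmark positions with a simple
    moving average, so the inferred shapes don't chatter frame to frame.
    tracks: list of frames, each a list of (x, y) landmark tuples."""
    n = len(tracks)
    half = window // 2
    smoothed = []
    for f in range(n):
        lo, hi = max(0, f - half), min(n, f + half + 1)
        frame_pts = []
        for i in range(len(tracks[0])):
            xs = [tracks[g][i][0] for g in range(lo, hi)]
            ys = [tracks[g][i][1] for g in range(lo, hi)]
            frame_pts.append((sum(xs) / len(xs), sum(ys) / len(ys)))
        smoothed.append(frame_pts)
    return smoothed

def landmarks_to_shape(points, pad=2.0):
    """Expand landmarks into a closed shape by offsetting each point
    away from the centroid, giving a matte that covers the seam with
    a small safety margin (a stand-in for the learned shape model)."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    shape = []
    for x, y in points:
        dx, dy = x - cx, y - cy
        norm = (dx * dx + dy * dy) ** 0.5 or 1.0
        shape.append((x + pad * dx / norm, y + pad * dy / norm))
    return shape
```

The real tool’s value, per the article, is exactly this separation: the automated pass gets the tracking and shapes close, and artists then refine the editable results in Silhouette.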
Laika’s lead roto/paint artist, James Pina, worked closely with the Intel team on developing the tool. “What the tool outputs is fully finished roto,” he explains. “It’s a lot faster than we could get if we were doing it by hand—it’s hours of work in a few minutes. We also get the trackers for the seams and these will power the Furnace scratch repair tool that we’re currently using. Then we’re able to take that data, export it out into Silhouette if we want to modify anything. We can take those trackers and we can refine our roto in Silhouette to get it perfect.”
Pina adds that obtaining the tracking nodes and roto shapes in this automatic way means artists can “spend the rest of our time tweaking the color and artistic side of it to get exactly what we want out of the puppet’s performance and make it look as good as we can, as opposed to spending all our time just struggling to get the roto done.”
Laika is wary of saying this is a push-button result in any way. “It’s not like you’ve got frames going in and finished roto coming out,” states Stringer. “What we’re really just trying to do is put tools in the hands of James and his team, so they can do things faster and more efficiently.”
The plan is to continue to refine the tool. Already, a few extra capabilities have come out of the collaboration, including a simple bounding box tracker and a ‘motion2roto’ node, which lets artists apply motion vector data to minimally keyframed splines to produce fully animated, articulated roto splines as output.
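The ‘motion2roto’ idea—driving a spline keyed on one frame across a whole shot using motion vectors—can be illustrated with a toy sketch. The function names and the motion-field format (a sparse dict of per-pixel vectors) are assumptions for illustration, not the actual node’s API.

```python
def advect_spline(points, motion_field):
    """Move each spline control point by the motion vector sampled at
    its (rounded) pixel location. motion_field maps (x, y) pixel coords
    to (dx, dy) vectors; missing pixels mean zero motion."""
    out = []
    for x, y in points:
        dx, dy = motion_field.get((round(x), round(y)), (0.0, 0.0))
        out.append((x + dx, y + dy))
    return out

def propagate(points, fields):
    """Chain advection across a shot: one motion field per frame
    transition turns a single keyframed spline into a fully
    animated one, with no per-frame hand-keying."""
    frames = [points]
    for field in fields:
        frames.append(advect_spline(frames[-1], field))
    return frames
```

Because each control point moves independently with the underlying image motion, the spline articulates rather than just translating, which matches the article’s description of “fully animated articulated roto splines” from minimal keyframes.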
Emerson believes so much more can and will be done in the area—not only in rotoscoping and paint—to the point where these solutions help artists get to the final shots much quicker. “It is a big moment for visual effects right now, what’s happening with machine learning and A.I.”
Laika and Intel are presenting their collaboration at SIGGRAPH 2020 in this Talk.