A new excerpt from issue #14 of befores & afters magazine, which covers in-house VFX tools.
When MPC was tasked with turning giant robots into vehicles and beasts, and vice versa, on Transformers: Rise of the Beasts, one of the first big challenges the studio faced was handling the change in ‘matching’ geometry between transformations.
To do that, the visual effects studio built a Transformation tool that could slice, separate and transform geometry and be something that animators and technical animators could use in the VFX pipeline.
“This was really one of the hardest things to get our head around,” notes MPC DFX supervisor Erik Gonzalez.
“One of the biggest core things about the transformation process is that once you have a model of a Transformer and it has to transform, for a lot of the exterior pieces like a chest plate or an arm plate, or anything like that that’s large that might need to move or separate in some way, we need an efficient and pipeline-friendly way to essentially slice and dice the geometry and change the model from what we’d started with. That right there, changing the model, is the crux of the problem.”
Changing the model
This idea of changing the model ‘disrupts’ the typical visual effects pipeline; that’s because, at least in the case of MPC, there are usually steps for validating whether or not a rig works with a certain model for animation. “If animation gets passed downstream to, say, the lighting department,” explains Gonzalez, “it’s really important that the model that animation uses is the exact same one that lighting’s going to use. Which means that as soon as anything gets changed and sliced and diced in the shot, it changes everything from there. It’s almost like you have an entirely new asset.”

This was the issue MPC had to grapple with, that is, creating a pipeline that was able to push shot-specific model changes downstream, and do it in a way that would still work. Plus, the studio sought to provide a tool for animators to actually accomplish the slicing, dicing, offsetting and transforms.
Gonzalez credits TD Thomas Rutter for the main architecture behind the Transformation tool, which also received significant input from animation and CG supervisors. “What we initially did was hijack the techanim pipeline. It already had the mechanics in it that would allow custom rigs to be made by the team on a per shot basis. It’s called the rig FX pipeline, which they usually use for cloth simulation or for model changes to insert and use a different type of model at the shot level that may be slightly different and have a slight deviation from what we have in our asset that’s built with our general rigs.”
The idea was that this could then allow animators to do the slicing, dicing, offsetting and rotation of geometry as they crafted a transformation. “That would then,” outlines Gonzalez, “feed in procedurally into a new rig FX setup that would output a new cache of the geometry on a per shot basis that would work with our current pipeline to be able to push downstream to our lighting department. Importantly, it would retain a lot of the attributes, a lot of the textures, a lot of the lookdev that we have in our normal pipeline.”
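To make the idea concrete: the rig FX approach described above amounts to a per-shot override layer, where sliced pieces replace their parent in the published cache while inheriting its lookdev attributes. The sketch below is purely illustrative; it is not MPC's code, and every name in it (`Piece`, `publish_shot_cache`, the material names) is invented for the example.

```python
from dataclasses import dataclass, field, replace

@dataclass
class Piece:
    """A hypothetical geometry piece carrying the lookdev data it needs downstream."""
    name: str
    material: str
    attrs: dict = field(default_factory=dict)

def slice_piece(piece, n):
    """Split one piece into n shot-specific sub-pieces.

    Each child inherits the parent's material and attributes, which is
    how lookdev can survive a per-shot model change."""
    return [replace(piece, name=f"{piece.name}_slice{i}", attrs=dict(piece.attrs))
            for i in range(n)]

def publish_shot_cache(asset_pieces, shot_overrides):
    """Build the per-shot piece list: sliced pieces replace their parent,
    everything else passes through from the asset unchanged."""
    out = []
    for piece in asset_pieces:
        if piece.name in shot_overrides:
            out.extend(slice_piece(piece, shot_overrides[piece.name]))
        else:
            out.append(piece)
    return out

# Example: for this shot, the chest plate is sliced into four pieces.
asset = [Piece("chest_plate", "mtl_armor"), Piece("arm_plate", "mtl_armor")]
cache = publish_shot_cache(asset, {"chest_plate": 4})
```

The point of the override-layer shape is that the asset itself is never edited; the shot-level cache deviates from it only where animation asked for a slice, which is what lets the rest of the pipeline keep treating it as the same asset.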

What artists worked with
The Transformation tool effectively became part of MPC’s animation toolset in Maya, although, notes Gonzalez, it was largely invisible. “That way, animators were allowed to focus on just the actual creative animation work, really being able to go through and choreograph all the different reveals of pieces, all the slicing of pieces, to get the most bang for the buck out of these kinds of shots.”
“That was the real magic of the animators,” adds Gonzalez. “They had to go in and grab specific pieces and move them and say, ‘This needs to animate this.’ That one piece they may slice into four other pieces and then four of those pieces may get sliced again. There were exponential possibilities in what we could really actually change on any of these Transformers. The animators had a lot of control and luckily the tool was built up in a way that they didn’t have to worry about caching out certain things and getting that geometry back into the pipeline. Once a shot was ready out of animation, it would be published through the tool, the caches would get sent to lighting, and then on the lighting side there was a bit of a check that they had to do to make sure that everything would shade and light correctly.”
One challenge MPC had to deal with was related to those thousands of possible new pieces that could be generated, as Gonzalez discusses. “As you can imagine, if you have a character that’s split into a million different pieces and suddenly, half of those pieces are new geometry that weren’t shaded or textured for some reason, it immediately creates a lot of work, again, almost like a whole new asset in that respect to texture and lookdev.”

That’s where the procedural nature of the Transformation tool came into play. “Our lookdev team was able to make sure material assignments survived after animation would publish. Then our lighting team would just take a few days to do a crosscheck which we built into our schedule. They would make sure that inside faces of newly split geometry still maintained some texture and some shader qualities.”
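The crosscheck Gonzalez describes boils down to a simple audit: after animation publishes, every piece in the shot cache, including freshly generated inside faces, must resolve to a known material before lighting renders. A minimal sketch of that audit, with invented names and data (not MPC's actual check):

```python
def crosscheck_materials(pieces, published_materials):
    """Return the names of pieces whose material assignment is missing
    or unknown to the published lookdev: approximating lighting's
    'will everything shade correctly?' pass after an animation publish."""
    return [p["name"] for p in pieces
            if p.get("material") not in published_materials]

# A sliced chest plate: the outer faces kept their parent's assignment,
# but the new interior faces came through with no material at all.
pieces = [
    {"name": "chest_plate_slice0", "material": "mtl_armor"},
    {"name": "chest_plate_slice0_inner"},  # newly split geometry, unassigned
]
missing = crosscheck_materials(pieces, {"mtl_armor"})
```

Anything the audit flags is exactly the "whole new asset" work Gonzalez mentions, so catching it in a scheduled crosscheck rather than at render time is the win.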
“We also had a really good visibility system set up,” continues Gonzalez, “because half of the work on the animation side was not only slicing, dicing and animating certain pieces, but also animating visibilities on certain pieces as they moved behind the camera and moved behind the character to where we didn’t need to see them anymore. We had to establish a really good visibility primvar pipeline for use in lighting, so that when pieces got to lighting we weren’t rendering every single piece of geometry that didn’t need to be there.”
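The visibility primvar idea can be sketched in a few lines: animation keys a per-frame visibility value on each piece, and lighting filters the piece list against it before rendering, so hidden geometry never reaches the renderer. This is a conceptual illustration only; the frame numbers, piece names and `pieces_to_render` helper are invented, and MPC's real pipeline authors this as primvars on the cached geometry.

```python
def pieces_to_render(visibility, frame):
    """Keep only the pieces whose visibility is on at this frame.

    Pieces with no keys at the frame default to visible, mirroring the
    usual convention that unkeyed geometry renders."""
    return [name for name, keys in visibility.items() if keys.get(frame, True)]

# Visibility keyed per frame, as animation would author it.
visibility = {
    "chest_plate_slice0": {1001: True, 1010: False},  # hidden once behind camera
    "arm_plate": {},                                   # never keyed: always visible
}
```

The benefit is the one Gonzalez names: with characters split into thousands of pieces, culling on the authored primvar keeps lighting from paying render cost for geometry the camera can no longer see.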
Read the rest of this article in issue #14 of befores & afters magazine.