
Motion controlled robots, controlled by…Maya!

Many readers will of course be familiar with the use of Autodesk Maya to animate CG creatures and characters. But what about using it to control industrial robots?

Don’t worry, this isn’t part of some robot apocalypse. Instead, it’s all about a Maya plugin developed by Evan Atherton and Nick Cote, researchers at the Autodesk Robotics Lab. They have released Mimic, an open-source Maya plugin that lets users simulate, program and ultimately control 6-axis industrial robots.

So, why would you do this in Maya? Well, the cross-over between animation, filmmaking and robotics already exists in several areas, such as shooting TV commercials and VFX work, in which industrial robots are fitted with a camera on the arm to perform repeatable, motion control-style moves. That means your current skills in Maya might come in handy for more than just animating CG characters.

Jumping in with a Dropship demo

To get a sense of what Mimic can be used for from a filmmaking point of view, check out this ‘Dropship’ project, in which Atherton teamed up with ILM virtual production visualization supervisor Landis Fields and a team of other artists to use industrial robots to operate both a miniature spaceship and a camera.


The Dropship itself was a Stratasys 3D printed model, the robots were from KUKA, the camera rig was a RED DRAGON and the lighting was a programmable ARRI SkyPanel. First, a normal animation workflow in Maya was used to animate a CG ship and see how it would look in relation to the camera. Then, in Mimic, CG representations of the robot arms were ‘attached’ to both the virtual ship and the camera, to make sure the planned moves stayed within the physical limits of the robots.

“The next day,” continues Atherton, “we just sent that data to the robots. And we filmed the model that way. It was amazing to see a person who’d never touched one of those robots in their entire life be able to choreograph this dance between two robots with Mimic.”

Check out the gallery below for a look at the shoot for the Dropship project. Also, Patreon supporters of befores & afters can see an exclusive video teaser of the project.




How Mimic works

Usually, industrial robots are controlled via something called a ‘teach pendant’. It’s essentially a control box – handheld or larger – used to program movements step by step, which can be a very manual process. In the industrial space, too, many robots are controlled by custom code. That means different systems for different robots and very little flexibility.

“Historically,” Atherton told befores & afters, “industrial robots have been super, super hard to deal with. They either require really tedious or ancient programming paradigms and hardcore coders. So I liked the idea of having a more design-friendly interaction.”

Mimic is aimed at re-thinking the control of robots as if they were CG models you would be animating. This is done using keyframing, animation curves, motion blending and inverse or forward kinematics. Sound like something you already do in Maya? Exactly.
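To make that idea concrete, here is a minimal sketch meant for Maya’s Script Editor. The node name ‘robot_tcp_target’ is an illustrative stand-in, not part of Mimic’s actual rig: you keyframe a target just as you would any animated object, and an IK-driven robot rig follows it.

```python
# Run this inside Maya's Script Editor. The names used here are illustrative,
# not Mimic's actual rig attributes.
import maya.cmds as cmds

# A locator standing in for the robot's tool-center-point (TCP) target.
target = cmds.spaceLocator(name='robot_tcp_target')[0]

# Keyframe the target like any other animated object; an IK handle (or
# Mimic's IK solver node) would then drive the robot arm to follow it.
for frame, position in [(1, (60, 40, 20)), (48, (20, 70, 50)), (96, (-30, 50, 10))]:
    cmds.currentTime(frame)
    cmds.xform(target, worldSpace=True, translation=position)
    cmds.setKeyframe(target, attribute=['translateX', 'translateY', 'translateZ'])
```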

“You animate the thing you want to happen, and the robot just goes along for the ride,” Atherton observes in one of his Mimic demo videos. “It enables you to not care as much about the robot control and programming, and do what you really care about, which is your animation.”

The other thing with industrial robot control systems is that generally there’s no control over ‘time’. The robot will go from point A to point B as fast as robotically possible. Mimic is also about re-thinking that side of things, especially because Maya can handle a timeline easily.



“What we’ve tried to do is create a time-based workflow,” outlines Atherton. “This is really necessary in film and media, where you say, ‘What I want to do is this really complex path and I want it to take three seconds and then I want to ramp up and ramp down and do all these complex things that you might do in a visual effects environment.’ You want to be able to just send that to a robot and have the robot play it back.”
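In Maya terms, that time-based control is just ordinary animation-curve editing. Continuing the hypothetical target from the earlier sketch, flattening the tangents at the ends of a move gives the ramp-up and ramp-down Atherton describes:

```python
# Still inside Maya, still using the illustrative 'robot_tcp_target' node.
# Flat tangents at the first and last keys produce a slow-in/slow-out ramp,
# so the robot accelerates and decelerates rather than moving at a constant
# (or maximum) speed between positions.
import maya.cmds as cmds

for attr in ('translateX', 'translateY', 'translateZ'):
    cmds.keyTangent('robot_tcp_target', attribute=attr, time=(1, 96),
                    inTangentType='auto', outTangentType='auto')
    cmds.keyTangent('robot_tcp_target', attribute=attr, time=(1, 1),
                    outTangentType='flat')
    cmds.keyTangent('robot_tcp_target', attribute=attr, time=(96, 96),
                    inTangentType='flat')
```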

A Mimic screenshot for the ‘Dropship’ project.

Atherton says he was inspired to implement these kinds of animation-centric control abilities by a company called Bot & Dolly (which was ultimately acquired by Google) that provided industrial robots for several art and filmmaking projects, including Gravity. “They proved, at least to me, that there could be a workflow for using more designer friendly tools for controlling robots,” says Atherton.

With that motivation behind him, the research engineer admits he started building Mimic for himself, at first, to practice his Python and Maya skills. The intention was to see whether a tool could be developed that made controlling robots more standardized. That applied to large-scale manufacturing and scientific uses of robots, as well as to an area Atherton had noticed cropping up more and more, one that also matched his own interests: filmmaking and VFX.

“We started seeing a lot of smaller production studios pop up using robots, and they were developing their own in-house kind of solutions, many of them influenced by Bot & Dolly. They were spending a lot of time developing tools. What we wanted to do, in some ways, was level the playing field, and give more people access to these tools so they just didn’t have to spend time building them and instead just go out and be creative and make things.”

Evan Atherton (left) and Landis Fields on the set of the Dropship project.

Mimic in action

So far, in addition to the ‘Dropship’ project, Mimic has been used in a number of tests and several undisclosed projects on the filmmaking and commercials side. One of the earliest adopters right after the plugin was released was Steam, a production studio in Santa Monica that makes commercials. “They had bought a little robot and they assumed there would be a tool out there to program it,” notes Atherton. “And they found that most were either proprietary and/or custom software. So they were like, okay, how much would it cost us to hire a developer and build this thing? But then they found us and within a week or two, they were using Mimic on their first shoot.”



This studio was making a ‘How to Train Your Dragon’-related commercial. They needed a very specific way for the camera to interact with a hand model. “They had a camera path and they were able to run that exact path over and over and over until the hand model got it down perfectly,” recounts Atherton. “So, instead of having the hand model needing to get it perfect and a camera operator having to do a perfect jib move, they could do it that way.”

Steam also relied on Mimic in conjunction with their motion control ‘Iris’ robot for an in-camera effect-filled Los Jarvis Mescal commercial, and a spec Apple Watch Series 4 commercial. Plus they worked on a Hot Wheels spot.

Recently, Atherton and Cote partnered with Perfect Infinitives’ executive creative director John Nierras on a project to demo one of Mimic’s new features: how it can interact with FIZ lens control motors.

“Our goal was to create something that felt really organic and less ‘robotic’, while still taking advantage of all of the benefits of motion control,” says Atherton. “We took an abstract, 3D printed object created by the Autodesk brand team and ran the robot using five different lighting passes that we comp’d together. We turned to Redrock Micro, makers of cinema accessories, who gave us early access to their Eclipse API, which gave us full control of their motors from Maya.

“This lets us keyframe focus, iris, and zoom positions right alongside the robot animation. For this particular project, we wanted to show off really subtle, organic focus pulls, but because we needed to do multiple passes, it was imperative that our focus was precise and repeatable so that we could blend the different passes seamlessly.”
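Redrock’s Eclipse API itself isn’t shown here, so the sketch below covers only the Maya side of the idea, using a hypothetical ‘fiz_ctrl’ node whose attribute names are assumptions. The point is simply that focus, iris and zoom become ordinary keyable channels sitting next to the robot animation.

```python
# Illustrative only: 'fiz_ctrl' and its attributes are hypothetical stand-ins,
# not actual Mimic or Redrock Eclipse node names.
import maya.cmds as cmds

fiz = cmds.createNode('transform', name='fiz_ctrl')
for attr in ('focus', 'iris', 'zoom'):
    cmds.addAttr(fiz, longName=attr, attributeType='double', keyable=True)

# A subtle, repeatable focus pull over two seconds (48 frames at 24 fps):
# the same curve plays back identically on every lighting pass.
cmds.setKeyframe(fiz, attribute='focus', time=1, value=1.2)    # metres (assumed units)
cmds.setKeyframe(fiz, attribute='focus', time=48, value=0.45)
```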

Then there was the 2019 MTV Video Music Awards, which incorporated large on-stage robots holding – and moving with – a series of screens full of imagery. Here, VTProDesign was engaged to design ways for presenters to be ‘revealed’ at the VMAs. Five robots were brought to the show, held in Newark. VTPro describes on its website how this worked: “All five robots activated simultaneously: two robots with LED screens affixed to them, two robots with lighting rigs and one robot to capture presenters as they walked out on stage. The robots were programmed using Mimic and cued using Touchdesigner, mixing and matching paths to achieve desired effects.”



Making Mimic, and what it solves

“The heart of the plugin is a custom dependency graph node I wrote that does inverse kinematics for six-axis robot arms, which are slightly more complicated than some other rigging systems,” explains Atherton, in terms of the mechanics behind Mimic. “Beyond that we needed it to be really flexible. So if we want to add a new robot, we can’t have it go, ‘Oh, let’s rebuild a rig.’ It literally had to be like this: you put in the parameters, you put in the lengths of the joints, you hit rig and the robot rigs itself.”
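As a rough illustration of that ‘enter the parameters and the robot rigs itself’ idea, here is a heavily simplified sketch: a function that builds a Maya joint chain from a list of link lengths. It is not Mimic’s actual rigging code, which also encodes axis orientations, joint limits and the custom IK dependency graph node.

```python
# A minimal sketch of parametric rigging: build a joint chain from link
# lengths instead of hand-placing joints. The naming and the purely vertical
# stacking are simplifications for illustration.
import maya.cmds as cmds

def rig_from_link_lengths(link_lengths, name='robot'):
    cmds.select(clear=True)
    joints = []
    height = 0.0
    for i, length in enumerate(link_lengths):
        # Each new joint parents under the previous one, forming a chain.
        # A real 6-axis rig would also set the rotation axis for each joint
        # (base yaw, shoulder/elbow pitch, wrist roll, and so on).
        joints.append(cmds.joint(name='%s_axis%d' % (name, i + 1),
                                 position=(0, height, 0)))
        height += length
    return joints

# Example: six link lengths (in scene units) for a small tabletop arm.
rig_from_link_lengths([40, 35, 30, 8, 8, 5])
```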

The reason that approach to rigging in Mimic is important, notes Atherton, is that, for a given camera position in space, there are actually several different robot configurations that can reach it. “So, on the front end of that, I wanted to create a UI that let people explore those things. Essentially a lot of the Mimic UI is just to make it easier for people who aren’t roboticists to play around. So, we do all the IK solves and things like that. And then there’s a couple of buttons that let people flip the different configurations and say, ‘OK, if I’m doing this shot, what happens if I flip the configuration from here?’”
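For background on what those configurations are: a typical 6-axis arm with a spherical wrist can reach the same tool pose in up to eight ways, combining shoulder, elbow and wrist flips. The snippet below only enumerates those combinations for illustration; it is not Mimic’s solver.

```python
# Enumerate the eight nominal IK configurations of a spherical-wrist arm:
# shoulder front/back, elbow up/down, wrist flipped or not.
from itertools import product

configurations = [
    {'shoulder': s, 'elbow': e, 'wrist': w}
    for s, e, w in product(('front', 'back'), ('up', 'down'), ('no-flip', 'flip'))
]
print(len(configurations))  # 8 candidate ways to reach one tool pose
```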


A Maya screenshot showing a view of the robots for the ‘Dropship’ project.

“Another aspect of it that was really important to me was doing a flexible IK-FK switching architecture,” adds Atherton. “It was all about your standard animation tools that people find necessary, but doing it for the robots. A lot of what I was trying to do was take out the complexity of the robotics itself for people. For instance, as a joint gets closer to its physical axis limit, the joint will turn red. So it was about putting in these visual cues for more artists and designers so they’d be like, ‘Oh, that robot doesn’t look happy, let me not do that!’”
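A hedged sketch of that kind of visual cue, with an assumed ±170-degree limit and illustrative node names rather than Mimic’s real values: check an axis rotation against its limit and switch the joint’s display colour to red when it gets close.

```python
# Illustrative limit check: colour a joint red once it passes 90 per cent of
# an assumed rotation limit. The joint name, attribute and limit are stand-ins.
import maya.cmds as cmds

def flag_near_limit(joint, attr='rotateY', limit=170.0, warn_fraction=0.9):
    angle = cmds.getAttr('%s.%s' % (joint, attr))
    near_limit = abs(angle) >= warn_fraction * limit
    if near_limit:
        cmds.setAttr('%s.overrideEnabled' % joint, True)
        cmds.setAttr('%s.overrideColor' % joint, 13)   # 13 is red in Maya's index palette
    else:
        cmds.setAttr('%s.overrideEnabled' % joint, False)
    return near_limit
```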

Mimic also ‘takes care’ of writing the robot code, i.e. it all happens in the background after you have done an animation. Atherton explains: “If everything looks like it checks out, you hit export and, based on the type of robot you’re using and the type of program you’d like to make, we write the robot code. There are two main ways you can tell the robot where to go. You can tell it a Cartesian position, say, ‘go to XYZ and then roll/pitch/yaw’. Here the robot controllers themselves can figure out how to get the robot there.”

“The other way is to tell the robot explicitly the angle of each axis. So you can say, ‘axis 1, go to 27 degrees, axis 2 go to…whatever.’ The benefit of doing it this way is that you then know that exactly what you send to the robot is what you get. If you send the Cartesian position, the robot can decide which particular IK configuration it wants to be in, so it could differ from your Maya interpretation. In that sense, we could just take a camera path and export those Cartesian positions and roll/pitch/yaw to the robot. But it might run into itself or it might not be able to reach a certain position. So without the visual indicators of the 3D model in Maya, it’s hard to tell if something’s going to work out on the robot.”
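Shown as plain data, the two kinds of target look roughly like this. The field names and units are illustrative; the syntax Mimic actually exports depends on the robot brand and program type.

```python
# 1) Cartesian target: position plus roll/pitch/yaw. The robot controller
#    chooses the IK configuration itself, which may differ from the Maya preview.
cartesian_target = {'x': 850.0, 'y': -120.0, 'z': 640.0,        # mm (assumed)
                    'roll': 0.0, 'pitch': 90.0, 'yaw': -45.0}   # degrees

# 2) Explicit axis angles: what you send is exactly the pose you get, so the
#    playback matches the configuration you previewed in Maya.
axis_target = {'axis_1': 27.0, 'axis_2': -35.5, 'axis_3': 100.2,
               'axis_4': 0.0, 'axis_5': 64.7, 'axis_6': -12.0}  # degrees
```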


Atherton also points out that another typical problem in robotics control is the concept of singularities, something that can be solved with Mimic. “Singularities are basically the equivalent of gimbal lock. So when two axes are locked, the robot doesn’t know how to flip through it. But those are the types of problems you can easily see if you have an animation environment and you have the CG model. You can just tweak the position a little bit. That’s the type of thing that, without the visual indicator, is super-hard.”
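One common case, used here purely for illustration, is the wrist singularity of a spherical-wrist arm: axes 4 and 6 align as axis 5 approaches zero. A trivial per-frame check of that kind might look like this, with made-up threshold and sample values:

```python
# Flag frames where axis 5 gets close enough to zero that axes 4 and 6
# align (the "gimbal lock" case Atherton describes). The threshold and the
# sample angles below are invented for the example.
def near_wrist_singularity(axis_5_degrees, threshold=5.0):
    return abs(axis_5_degrees) < threshold

for frame, a5 in [(1, 42.0), (12, 9.5), (13, 3.1), (14, -2.0)]:
    if near_wrist_singularity(a5):
        print('Frame %d: axis 5 at %.1f deg - nudge the target to avoid the singularity' % (frame, a5))
```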



How you can try out Mimic yourself

Mimic is available for anyone to download. But, don’t you need a robot, too? Atherton says, yes, obviously, you need a robot, and there are small ones out there available to buy. “We had a group buy a robot from a reseller through eBay, and I think it was around $12,000, which I guess is a lot of money, but when you consider they put a $100,000 camera package on it, then it probably wasn’t so crazy.”

“Part of what we want to build,” adds Atherton, “is a community of people around this area that can support each other. And we’ve seen that start to happen, which is really nice.”

Head to https://www.mimicformaya.com/ to find out more about Mimic and download the plugin. There’s a link to a Slack community for Mimic on the site, too.




