Meet MotionMaker: Maya’s New AI Animation Tool

Say goodbye to time-consuming locomotion work and hello to more creative freedom with MotionMaker, now available in Maya. It’s a new AI-powered tool from Autodesk that lets you guide your character’s motion with just a few keyframes or a simple motion path and iterate until it feels just right. We spoke with Evan Atherton, senior principal research scientist at Autodesk and one of MotionMaker’s creators, to explore the tool’s origins and capabilities, the time savings for animators, and how Autodesk is thinking about AI in animation.

Q: Evan, great to talk with you. To start, what’s your background and how did you get into animation? 

Evan: That’s a real softball! My name is Evan Atherton and I’m a Senior Principal Research Scientist on the Autodesk Research team, and a mechanical engineer by training. I did my undergrad and master’s at UC Berkeley, and during that time took a course called “Animation for Mechanical Engineers.” It taught us how to use Autodesk tools to animate complex mechanical systems to better communicate our ideas. That class really sparked my interest in animation and eventually filmmaking and visual effects too. At Autodesk, the longest stretch of my time in research was on the robotics team, where we worked on making industrial robots more accessible for design workflows. I got especially interested in how creatives could use them. I even built a Maya plugin to animate robot arms, which pulled me deeper into Maya, rigging, and animation. That’s when my love for this stuff really solidified. 

Q: Today, we introduced MotionMaker in Maya. Give us the elevator pitch. What is MotionMaker? 

Evan: MotionMaker is a new animation system that lets artists direct characters more like they would in a mocap studio, but virtually, inside Maya. You can give high-level instructions, like telling a character to walk, jump, or sit, and guide them through the scene by setting key targets or paths. It’s part of the animation editor toolset in Maya, with a dedicated MotionMaker Editor. That’s where you manage characters, drive them with MotionMaker, and access all the related features. The goal is to make character animation feel more intuitive, like giving stage directions to a digital actor.

It gives animators time back, not to crank out more shots, but to explore, experiment, and really craft the perfect performance.

Q: Take us under the hood. What AI models are powering the tool?

Evan: At the core is a machine learning model we describe as an autoregressive motion generator. It’s built from multiple neural networks. The magic is really in how it fits into the larger workflow. We take motion data from Maya, pass it through the model, and it predicts the next pose, frame by frame, to create smooth, natural movement. A lot of the robustness comes not just from the model itself, but from how we handle that motion data going in and how we translate the results back onto the character in Maya. 
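
For readers who want a concrete picture of what “autoregressive” means here, the sketch below shows that frame-by-frame loop in miniature. It is illustrative only, not Autodesk’s implementation: the function names are hypothetical, and the toy “model” simply blends the pose toward a control target, whereas the real system uses trained neural networks and far richer pose and control representations.

```python
import numpy as np

def toy_next_pose(pose, control_target, blend=0.1):
    """Stand-in for the learned model: nudge the current pose toward the
    control target. A real autoregressive motion model would predict the
    next pose from the current pose (and recent history) plus controls."""
    return pose + blend * (control_target - pose)

def generate_motion(initial_pose, control_targets):
    """Autoregressive loop: each predicted frame is fed back in as the
    input for the next prediction, producing motion frame by frame."""
    poses = [np.asarray(initial_pose, dtype=float)]
    for target in control_targets:
        poses.append(toy_next_pose(poses[-1], np.asarray(target, dtype=float)))
    return np.stack(poses)

# Guide a three-value "pose" toward a single key target over 120 frames.
frames = generate_motion([0.0, 0.0, 0.0], [[1.0, 2.0, 0.5]] * 120)
print(frames.shape)  # (121, 3)
```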

Set a few keyframes or a motion path, hit generate, and get motion to build on.

Q: What kind of training data was used?

Evan: We used motion capture data that we specifically collected for this tool. There are three core datasets: one from two dogs, combined into a single “wolf-style” model, and two from human performers, representing a basic male and a basic female motion style. The idea is to build each model around a specific actor’s performance, almost like capturing their unique motion personality. MotionMaker includes those three base styles, but that’s just the beginning. The system is designed to support additional styles as we go. 

Q: Who is MotionMaker designed for?  

Evan: MotionMaker is designed for anyone in the animation pipeline, whether you’re working on layout, pre-vis, tech-vis, or even hero animation. My hope is that no matter where you are in the process, you can find a way to integrate it into your workflow. 

Q: I hear this started as an Autodesk Research project. What problem were you trying to solve? 

Evan: At the time, there was a lot of exciting research, especially at SIGGRAPH, around motion control for character animation. I’d been following it closely and wanted to prove that this work could already be useful in real animation workflows. The goal was to take that foundational science and bring it into Maya in a way that fit naturally with the tools animators already use. I wanted to show how it could help artists move faster, especially with things like long locomotion sequences. 

Q: Have you had the chance to talk with other animators about it? What are they saying? 

Evan: Yeah, we’ve talked to quite a few animators. One consistent piece of feedback is that they can see exactly where MotionMaker fits into their pipeline. For certain types of shots, it gets them 80% of the way there, and then they can layer in their own touches to hit the final performance they want. That’s been really encouraging. The biggest question we get is, “Can I use my own data?” Animators know what they want, and they want tools that give them that control. So, we’ve spent a lot of time listening, not just to feature requests, but to the questions they ask. For example, a common concern was foot sliding. That led us to build a foot-slide reduction tool. In early prototypes, we had only one path mode, which was “stay close to the path but look natural.” But animators wanted more precision, even if it meant sacrificing a bit of natural motion. So, we added multiple path modes to give them that control. It’s been a collaborative process.

Q: What kind of time savings are we talking about? 

Evan: One example I always come back to is a 10-second shot of a dog running, turning, and jumping. I asked our PM, who was an animation supervisor for years, how long that would’ve taken traditionally. His answer? “We probably wouldn’t have even bid on it. But if we did, maybe two weeks.” With MotionMaker, laying out that shot took about a minute. But to me, it’s not just about time savings, it’s about freedom to iterate. You can quickly try different jump timings, tweak a turn, or adjust the pacing of a scene. You could do 10 versions in an afternoon and see what feels right. It gives animators time back, not to crank out more shots, but to explore, experiment, and really craft the perfect performance. And the transitions between gaits, turns, and jumps come naturally from the model, with no stitching mocap clips together. That’s a huge win.

What we hope is that this tool can handle some of the stuff animators either don’t want to do or find time-consuming, so they can focus on what excites them creatively.

Lay out a shot in minutes that would otherwise take weeks.

Q: How customizable is the output? 

Evan: There are two layers to customization. First, the generation itself is pretty art-directable. You can tweak the path mode, adjust the character’s facing direction, and set when actions like jumps happen, all to guide the motion the way you want. Once the motion is generated, it’s baked to keys on the joints. From there, it behaves like mocap data. You can add animation layers, bake to a control rig, and use all your usual Maya tools to refine it. So, it’s fully integrated with standard animation workflows.
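
Since the output is just baked keys on the skeleton, the refinement Evan describes happens with ordinary Maya scripting or UI, not a MotionMaker-specific API. Below is a minimal sketch of those standard steps using maya.cmds; the joint name pattern, layer name, and frame range are hypothetical placeholders.

```python
# Runs inside Maya; these are standard maya.cmds calls, not a MotionMaker API.
from maya import cmds

# Gather the baked skeleton joints produced by the generation step
# (the "character_rig:*_jnt" pattern is a made-up example).
joints = cmds.ls("character_rig:*_jnt", type="joint") or []
cmds.select(joints)

# Put hand-keyed adjustments on an animation layer so they sit
# non-destructively on top of the generated motion.
layer = cmds.animLayer("motion_polish")
cmds.animLayer(layer, edit=True, addSelectedObjects=True)

# Later, flatten everything back down to plain keys on the joints,
# exactly as you would with mocap data.
cmds.bakeResults(joints, time=(1, 240), simulation=True)
```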

Q: So MotionMaker isn’t just one-button magic. It comes with a full set of features. Can you walk us through some more key tools? 

Evan: Yeah, honestly, the MotionMaker Editor itself is one of the biggest leaps forward from the original research prototype. That early version worked, but it wasn’t very user-friendly. You had to dig into the Attribute Editor and the Outliner to set keys, control timing, all of that. Now, it’s all visual and much more intuitive, more like a nonlinear editor. You can drop in actions like jumps, move or mute them, and adjust timing in a clean, visual interface. Then there are features like path modes, where you can quickly toggle between different behaviors and see what works best for your shot. One of my favorites is the auto speed ramp. In mocap, you’re limited by how fast a person or dog can physically move. But sometimes, for a shot or stylized moment, you need more. The speed ramp lets you push past those limits without it breaking. And even when it does start to break, that exaggerated motion can be great for superhero-style effects. 

MotionMaker comes with its own Editor because you’re not just generating motion, you’re directing it. 

Q: If you could say one thing to fellow creatives about AI’s role in their work, what would it be? 

Evan: This is a tough one, because the easy answer is just “AI is a tool.” But artists have been hearing that a lot, like, yeah, artists already know AI can be a tool. For me personally, it’s about making sure the creative people are involved in making these tools possible, like the artists who provide the data and the actors we work with. They all know what this is for and how it’s being used. It’s a trust thing. When we went out to capture motion data ourselves, it was important that everyone involved understood what the tool was for, that it was a machine learning model to help animators. On the user side, artists often ask where the data comes from, which is totally valid. So being transparent about how we captured the data and what the tool knows is key. I don’t see AI as a push-button, “here’s your hero animation” kind of thing. What we hope is that this tool can handle some of the stuff animators either don’t want to do or find time-consuming, so they can focus on what excites them creatively: crafting performances, adding intention, all that human stuff. 

Q: So, what’s next? Is this a finished tool or the start of something bigger? 

Evan: By no means is it finished. For us, this is just the beginning. We’re really excited to see how artists use it, get their feedback, and let that guide where we go next. Like, what parts are they loving? What would make it even more useful for them? A big thing on our radar is letting them use their own data and curate their own stuff. That’s a priority. It’s an evolving process, and we can’t wait to see where it goes. 

Tamaskan dogs in motion capture suits took part in the development of MotionMaker.

Q: Got a ‘did-you-know’ gem about MotionMaker that readers can flex in their next animation convo? 

Evan: Oh, for sure! One of the coolest things we did was with dogs, and honestly, it was one of the best days of my life. Did you know the breed we captured is called the Tamaskan? They’re kind of rare, somewhere between a husky and a wolf. A lot of people mistake them for huskies, but they’re different. What’s really cool is these dogs are super well-trained. They do a ton of film and TV work, both in motion capture and on set. But their main gig is as therapy and service dogs, so they kind of moonlight as motion capture actors. Honestly, I couldn’t imagine better actors for our first model.

Q: For those who want to dive deeper, where can they learn more? 

Evan: I’ll be hosting a digital talk on MotionMaker on June 25 and June 26. Join the day and time that works best for you! In the meantime, you can check out our documentation to get started or reach out to me on LinkedIn. I really want to hear feedback. If you try it and feel like it doesn’t quite do what you expected, or it’s not fitting your workflow, I’m curious to know why and what would make it more useful for you. 

Learn more about Autodesk solutions like Maya for animation.

Register for the live MotionMaker digital talk on June 25 at 11am PT/2pm ET or June 26 at 2pm BST/3pm CEST.