Autodesk Vision Series Delves into AI and Its Role in Production   


AI in Media & Entertainment panel at SIGGRAPH 2023. From left to right: Rachael Appleton, Autodesk; Phil Barrenger, Rising Sun Pictures; Nikola Todorovic, Wonder Dynamics; Jen Hollingsworth, Flawless AI; and Mike Seymour, fxguide.

While there is significant hype and some fear around the use of AI in Hollywood, one thing is clear: AI is paving the way for a future where technology enhances productivity and fuels boundless creativity.

As part of the Autodesk Vision Series at SIGGRAPH 2023, Autodesk’s Director of Media & Entertainment Business Strategy, Rachael Appleton, moderated an enlightening panel on the rising possibilities and impacts of using AI and machine learning in production. The discussion, “Exploring the Transformative Power of AI in Media & Entertainment,” looked at AI’s trajectory and potential from the perspectives of those making the tools and those using them in production.

Reframing the role of AI in VFX

The conversation demystified language and controversies around AI, and addressed common misconceptions about how generative AI works in a real-world, practical sense.

Mike set the tone for how to think about AI, describing, “I often talk about a mobile phone. If you are using it as a phone, it’s a phone. If you’re using it as something to watch a movie and that’s all you do, it’s actually a TV. And if it doesn’t work and you use it to open the door, it’s a door stop, right? Technology is what you use it for.”

For Mike, it’s crucial to rally around the talented filmmakers and artists who are using this technology to achieve their creative goals. He continued, “It’s so important that we don’t allow the conversation to minimize their role by pretending that AI is doing it all.”

Jen echoed the importance of centering on filmmakers when talking about and developing new technologies, bringing her unique perspective from Flawless AI, which operates as both a production company and soon-to-be SaaS provider. The shared vision in the industry is for AI tools to enhance existing creative processes. She explained, “We really believe that we are going to transform filmmaking, so for us the question is how do we make this approachable for people? We’re making sure that we fit into workflows, budgets, and timelines, because we don’t want to disrupt a process that has been in existence for so long that it prevents adoption. We want to create an easy entry point for people to then scale with us, so we can be part of their process of making really great films.”

Phil offered a complementary perspective, touching on the unique partnership that Rising Sun Pictures has with the Australian Institute of Machine Learning (AIML) at the University of Adelaide. The collaboration with AIML came about while working on the feature film Elvis, where footage of actor Austin Butler needed to be retrofitted onto real-world footage from decades ago. Rising Sun Pictures also offers degrees in the visual effects field in conjunction with the University of South Australia, which has proven to be an invaluable training ground for new artists entering careers in VFX.

Phil shared, “Collaboration became a prototyping of how machine learning can fit within the production workflow. For us, the journey over the last five years has been asking how do you take the technology and actually infuse it through the artist? Rather than a displacement of jobs, we ask what are the new opportunities? And how do we get this technology into the hands of the artists so that they can determine how their jobs are going to evolve?” He noted that relationships with educational institutions are opportunities for the industry to reach young professionals, which can lead to more innovation and smoother adoption.

Overcoming budget constraints

Nikola and Jen also addressed how AI relates to the topic of budget constraints, an all-too-common challenge in the visual effects industry. Nikola and his creative partner Tye Sheridan – both self-described sci-fi geeks – quickly learned that the otherworldly films they yearned to make could not be accomplished on an independent budget. That was the motivation to start Wonder Dynamics and their AI-powered toolset, Wonder Studio.

Wonder Dynamics’ AI tool Wonder Studio automatically animates, lights, and composites CG characters into a live-action scene.

“When I worked in indie films, I was frustrated that due to budget constraints you couldn’t really achieve your vision,” explained Nikola. “At first, we wanted to build these tools for ourselves, selfishly, because we wanted to make the movies we were writing, but then we realized that this was a bigger problem than just us. So, the mission of the company has always been getting indie filmmakers to do what traditionally only studios can do with CG and heavy visual effects. At the end of the day, the artists and filmmakers are the ones that are going to be using these tools, so we’ve got to build them inside of the industry, with the artists, to see what they need.”

Coming from her experience on the studio side, Jen added her thoughts on the value that producers and studio executives can also find in AI.

“For 15 years, I would sit in greenlight meetings so excited by the script and vision for the movie being greenlit. And I never heard at the end of the process, ‘this movie turned out exactly like we envisioned’ or ‘this movie is better than we thought it could be.’ It is always that it’s not as good as it could have been, and that’s all because of the budget constraints, the time constraints, just getting things coordinated.”

Enhancing artist workflows

Nikola worked with the Russo Brothers on the upcoming film The Electric State, which brought many of the goals of Wonder Dynamics into a real-world studio workflow.

“Learning firsthand from the Russos and their VFX team really helped us to more deeply understand the pain points and to tailor our product to the artist. We worked with the artists and focused on what their pipeline is, which creative applications they’re using, and how they can get the data they need most efficiently. And because of AI, we’re now able to get them all the necessary data much faster – the machine learning models will extract everything from body animation, facial animation, light estimation, camera tracking, etc. – and they can start on their 3D scenes much earlier.”

Mike reflected on his experiences testing Wonder Studio and the benefits it can bring to artists. “What I found really great when I was first playing with Wonder Studio, is that it gets folks over the hump of getting started with AI. Because the question for many filmmakers is, how do I even get into this thing? And so the fact that you can try this and see the value right out of the gate is rewarding.”

The panelists also explored how AI fits into creative workflows and makes artists’ lives easier, providing greater creative freedom. Mike explained, “Eventually I’m going to want to use Maya because I’m going to want to do high-end character work. But I might have to spend a year to even get to the point where I can show my wife what I’m working on, and she can see how cool it looks. But if I could get to that point on day one, that is exactly what we want.” Rather than displacing artists, AI tools act as a shortcut to get artists to their desired visuals.

Phil added, “I think that’s one of the most exciting things about SIGGRAPH this year, is how many of the DCCs are implementing AI tools to help artists get closer to their vision and start interacting with it right away. This helps artists build up an intuition, which will slowly spread to building the intuition for what data goes in. One of the nice things about AI and visual effects is that we can really heavily refine the data toward the output that we want.”

AI brings new opportunity

Furthering the topic of creative freedom, Mike offered a refreshing view of AI and the evolution of technology as it relates to VFX jobs.

“It took 350 people to make the first Blade Runner film, and by the time they made Blade Runner 2049, that grew to 3,000 people. There just isn’t any historical basis that facilities are suddenly getting rid of tons of people. We have always found ways to take entertainment technology and use it to make more entertainment technology. Just look at what’s going on with the streamers in terms of the quality of visual effects work in television, which is spectacularly better than what it used to be.”

Nikola hopes AI technology will help to democratize the visual effects industry, so that more people from all over the world are able to tell stories even if they require high-concept visuals.

“I think we’re really at a fundamental evolution point in this industry, but I personally see it as super positive. This is a really hard industry to get into, and there’s a really high barrier to entry. I don’t think the ability to tell stories should be tied to socioeconomic status; stories should come from everywhere. But right now it’s so expensive to do that. So that’s where I see this technology as a means to an end. Let’s have these stories come from all over the world.”

In a recent Autodesk News blog post, Autodesk discussed the integration of AI services developed using NVIDIA Picasso, a foundry for building generative AI models, into Maya. Autodesk is also teaming up with Wonder Dynamics to deliver an integration between Maya and Wonder Studio, a tool that harnesses the power of AI for character-driven VFX workflows. AI has played an important role in boosting VFX workflows for many years, and we’re excited to be a part of generative AI taking things one step further.

Want to hear more on AI-powered workflows in post-production? Get valuable insights on emerging technology and make connections with industry experts at AU 2023. Get a free Digital Pass today.