Animating the Frontier: Realism and rigging in 1923 with WeFX

Image courtesy of WeFX

On Taylor Sheridan’s Yellowstone prequel series, 1923, the Dutton family and other protagonists are under constant pressure from the encroachments of the modern world. One mistake and they could end up shot, drowned, frozen to death, or, mercifully, arrested. It’s fitting, then, that when WeFX had to complete 20 VFX shots in about three months for a highly dramatic horse chase sequence in Season 2 of 1923, the studio faced its own modern pressures and little margin for error.

The media and entertainment industry today imposes tighter timelines and smaller budgets on VFX houses while demanding quality as high as ever. WeFX is a full-service VFX studio out of Toronto that works with the agility and flexibility of a boutique while producing blockbuster results. It has built a solid reputation for creature work, environments, and FX on projects like What We Do in the Shadows, Reacher Season 3, John Wick: Chapter 4, and most recently Wednesday Season 2, which featured the studio’s creature work, including the CG Thing, throughout the series.

WeFX Head of Technology Laurence Cymet says that the studio’s boutique size of about 100 team members helps them communicate and collaborate effectively. “Everyone knows each other and works closely together across all departments,” Laurence says. “No one is siloed.”

However, WeFX also stays nimble enough to scale up when larger projects call for it. The studio’s workflow revolves around the Autodesk Media & Entertainment Collection, including Autodesk Maya for 3D character and creature rigging and animation, and Autodesk Arnold for rendering. The M&E Collection’s comprehensive support for open standards like USD also enables WeFX to render more complex scenes and to work seamlessly across DCC apps and with other VFX studios. In addition, Autodesk Flow Production Tracking is the core of the studio’s project management and communication framework.

An emotionally charged horse chase animation

All of WeFX’s advantages would come into play for its work on Season 2 of 1923. During the Autodesk webinar breaking down the studio’s VFX shots for 1923, WeFX Executive Producer Steve Stransman said that of all the sequences they were bidding on, the toughest by far was the horse chase. In this suspenseful scene, Pete Plenty Clouds’ horse collapses from exhaustion and Pete (portrayed by Jeremy Gauna) topples violently to the ground. The sequence required seamless continuity between the practical horse and actor and their CG counterparts: a photorealistic, rigged 3D horse and a “Digi-Double” of the actor.

“Working on horses for a Taylor Sheridan production is not something that we took lightly,” Laurence says. “He and his team work closely with horses every day, so they would have an incredibly discerning eye.”

Horses, however, are neither willing motion-capture performers nor easy to portray convincingly. On the webinar, WeFX Head of CG Igor Avdyushin described how they had to “build a horse from the inside, out.” They started with four 3D scans of the horse: with the saddle, without the saddle, and with its mouth open and closed.

Horses are difficult to portray for a couple of reasons: audiences are familiar with live-action horses and will notice if something looks off, and the animals won’t stand still or hold a pose while being scanned. Nonetheless, from the 3D scans the team began modeling and rigging the horse. A dedicated rigger started with a basic quadruped rig and developed it from there.
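
A “basic quadruped rig” starts with joint chains for the spine, neck, and legs, with IK handles and animator controls layered on top. The sketch below is a rough illustration of that starting point using Maya’s Python API, not WeFX’s production rig; the joint names, positions, and control shapes are placeholders.

```python
# One hind leg of a starter quadruped rig, written for Maya's Python
# interpreter (maya.cmds). Names and positions are illustrative only.
from maya import cmds

cmds.select(clear=True)

# Hind-leg chain: hip -> stifle -> hock -> fetlock -> hoof.
positions = {
    "hip":     (0.0, 10.0,  0.0),
    "stifle":  (0.0,  6.5,  1.0),
    "hock":    (0.0,  3.5, -0.5),
    "fetlock": (0.0,  1.0,  0.5),
    "hoof":    (0.0,  0.0,  1.5),
}
joints = [cmds.joint(name=f"L_{name}_jnt", position=pos)
          for name, pos in positions.items()]

# A rotate-plane IK handle from hip to fetlock lets the animator plant
# the leg with one handle while the chain bends naturally.
ik_handle = cmds.ikHandle(startJoint=joints[0], endEffector=joints[3],
                          solver="ikRPsolver", name="L_leg_ik")[0]

# A simple NURBS circle as the animator-facing control for the handle.
ctrl = cmds.circle(name="L_leg_ctrl", normal=(0, 1, 0), radius=1.5)[0]
cmds.matchTransform(ctrl, ik_handle)
cmds.parentConstraint(ctrl, ik_handle, maintainOffset=True)
```

The rotate-plane solver is a common choice for legs because the knee or hock direction stays controllable with a pole vector; from a skeleton like this, a rigger layers on the horse-specific anatomy and deformation.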

During animation, getting the horse’s motion just right was a major challenge on a tight timeline. The horse needed to feel desperate right up to the moment of impact, and the animators had to capture that intensity and realism in a close-up shot. Everything was animated in Maya, and the horse’s muscles and breathing had to stay perfectly in sync with the saliva, blood, and flying dust.

The animators had some trouble getting the right deformations for certain poses, so they worked in sync with the rigger to add extra controllers to the meshes. When that still wasn’t enough for details like the muscles in the horse’s neck, they brought in the sculpting department to help.
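
A common way to layer sculpted fixes like these on top of a rig is a corrective blend shape, whose weight is driven by the joint rotation that causes the bad deformation. The following is a generic Maya sketch of that idea rather than WeFX’s actual setup; the proxy cylinder stands in for the horse mesh, and every node name is hypothetical.

```python
# Generic corrective blend shape driven by a joint angle (maya.cmds).
# Proxy geometry and node names are hypothetical stand-ins.
from maya import cmds

# Stand-in geometry so the sketch runs on its own; in production the base
# would be the horse mesh and the corrective a sculpt from the sculpting team.
base = cmds.polyCylinder(name="horse_neck_proxy", height=4, radius=1)[0]
corrective = cmds.duplicate(base, name="neck_flex_corrective")[0]

# The joint whose rotation produces the troublesome pose.
cmds.select(clear=True)
neck_jnt = cmds.joint(name="neck_01_jnt")
driver = neck_jnt + ".rotateZ"

# Add the sculpt as a blend shape target on the base mesh.
bs = cmds.blendShape(corrective, base, name="neck_corrective_bs")[0]
weight = f"{bs}.{corrective}"

# Drive the target weight from the joint angle: off at rest, fully on
# when the neck bends 60 degrees.
cmds.setDrivenKeyframe(weight, currentDriver=driver, driverValue=0, value=0)
cmds.setDrivenKeyframe(weight, currentDriver=driver, driverValue=60, value=1)
```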

That back-and-forth between animation, rigging, and sculpting exemplifies the kind of cross-departmental communication and synchronization that Steve says is one of the best aspects of the WeFX workflow. WeFX VFX Supervisor Ryan Ng adds that “the level of collaboration on this team is unique.” CFX artists work in sync with animators and simulation artists, and the studio’s internal simulation system produces repeatable results, which helps CFX artists finish a shot faster; if the animation changes later, the simulation still follows the same basic framework. “It’s actually better to work in tandem in this particular scenario,” Ryan says.

Collaboration across teams

In another example of cross-departmental collaboration, the modeling, look-dev, groom, and lighting artists worked together to nail the complex look of sunlit close-ups. These shots revealed details like rippling muscles and veins, the wavy, fuzzy texture of the hair, and even the glistening wetness of the horse’s eyes, which needed to show that the animal was in great distress.

To achieve a one-to-one likeness with the practical horse, Igor says they had to carefully dial in the hues, textures, and materials, and use shading tricks in Arnold to bring out highlights in the horse’s millions of fine hairs. WeFX even labored over the precise details of the saddle and accessories, such as the fuzzy fabric of the blanket under the saddle.
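
Hair highlights in Arnold are usually shaped in the dedicated hair shader rather than painted into textures. As a small, hedged example (it assumes Maya with the Arnold plug-in loaded, and the values are illustrative, not WeFX’s look-dev), the standard hair shader’s melanin and roughness controls set the coat’s darkness and how tight and bright its specular highlights appear:

```python
# Illustrative Arnold hair-shader setup in Maya Python; assumes the mtoa
# plug-in is available. Values are placeholders, not WeFX's look-dev.
from maya import cmds

cmds.loadPlugin("mtoa", quiet=True)

hair = cmds.shadingNode("aiStandardHair", asShader=True, name="horse_coat_hair")
cmds.setAttr(hair + ".melanin", 0.65)    # darker, chestnut-like coat color
cmds.setAttr(hair + ".roughness", 0.2)   # lower roughness = tighter, brighter highlights
```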

Rendered speechless

WeFX’s collaboration-heavy workflow continued straight through to the rendering stage. Lighting Supervisor Viduttam Katkar explained on the webinar how complex the rendering job was, with 15 to 20 render elements in a single shot. Viduttam’s team identified a key shot early in the process and used it to establish a foundational lighting setup.
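
“Render elements” are what Arnold calls AOVs: separate passes such as diffuse, specular, and emission that compositors can rebalance without re-rendering the frame. As a minimal sketch of how such passes get registered through Arnold’s Maya plug-in (the list below is generic, not WeFX’s actual element breakdown):

```python
# Registering a handful of built-in Arnold AOVs from Maya Python; assumes
# the mtoa plug-in is installed. The pass list is generic, not WeFX's.
from maya import cmds
from mtoa.aovs import AOVInterface

cmds.loadPlugin("mtoa", quiet=True)

aovs = AOVInterface()
for name in ["diffuse_direct", "diffuse_indirect",
             "specular_direct", "specular_indirect",
             "sss", "transmission", "emission", "volume"]:
    aovs.addAOV(name)  # each pass becomes its own element in the rendered EXRs
```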

Many layers had to interact with one another: an entirely CG background with “tons” of grass, foliage, and rocks, plus FX layers of dust, saliva, blood, and the horse’s vaporous breath.

Viduttam and his team rendered everything with Arnold’s CPU renderer, outputting Deep EXR files, which store multiple samples per pixel and provide precise control when layering elements together in compositing. He said rendering took about 40 to 60 minutes per frame, with files of about 2 GB per frame for a 5K render.
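
Those per-frame figures add up quickly across a sequence. Here is a back-of-envelope estimate using the numbers quoted above; the per-shot frame count is an assumption for illustration, not a figure from WeFX:

```python
# Rough render-farm math from the quoted figures. Only minutes_per_frame,
# gb_per_frame, and the 20-shot count come from the article; frames_per_shot
# is an assumed value.
minutes_per_frame = 50   # midpoint of the quoted 40-60 minutes per frame
gb_per_frame = 2         # quoted Deep EXR size for a 5K frame
shots = 20               # the sequence WeFX delivered
frames_per_shot = 120    # assumption: ~5 seconds per shot at 24 fps

frames = shots * frames_per_shot
render_hours = frames * minutes_per_frame / 60
storage_tb = frames * gb_per_frame / 1000

print(f"{frames} frames -> ~{render_hours:,.0f} CPU render hours "
      f"and ~{storage_tb:.1f} TB of Deep EXRs")
# 2400 frames -> ~2,000 CPU render hours and ~4.8 TB of Deep EXRs
```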

Yet what really counts in the end is the result: a tense, riveting, stunning scene that achieved WeFX’s goal of cinematic realism. The final scene that aired on Paramount+ sparked social media threads from fans who were concerned that a real horse may have been killed for the show. They couldn’t tell whether or not the horse was CG.

OpenUSD for the win

WeFX has a tight-knit team, including many artists who have worked together since before the studio was founded in 2020. That kind of interpersonal dynamic can’t be bought by hiring rockstar producers or downloading the latest software update. However, open-standard technology like OpenUSD does allow them to collaborate more easily across departments. USD’s layering scheme enables multiple departments, like layout and lighting, to work on the same scene simultaneously and nondestructively.
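
The mechanism behind that nondestructive workflow is USD’s layer stack: each department writes opinions into its own layer, and stronger layers override weaker ones when the shot is composed. Here is a minimal OpenUSD Python sketch of the idea; the file, prim, and attribute names are hypothetical, not WeFX’s shot structure.

```python
# Minimal OpenUSD layering example (requires the usd-core package).
# File, prim, and attribute names are hypothetical.
from pxr import Sdf, Usd, UsdGeom

# Each department owns its own layer.
layout = Sdf.Layer.CreateNew("layout.usda")
lighting = Sdf.Layer.CreateNew("lighting.usda")

# The shot layer stacks them; earlier sublayers are stronger, so lighting's
# opinions win over layout's without ever touching layout's file.
shot = Sdf.Layer.CreateNew("shot_010.usda")
shot.subLayerPaths.append("lighting.usda")
shot.subLayerPaths.append("layout.usda")
shot.Save()

# Layout blocks in the horse prim.
stage = Usd.Stage.Open(layout)
UsdGeom.Xform.Define(stage, "/shot/horse")
layout.Save()

# Lighting overrides only what it needs on the same prim.
stage = Usd.Stage.Open(lighting)
prim = stage.OverridePrim("/shot/horse")
prim.CreateAttribute("userProperties:keyExposure",
                     Sdf.ValueTypeNames.Float).Set(1.5)
lighting.Save()

# Opening the shot composes both departments' work nondestructively.
composed = Usd.Stage.Open("shot_010.usda")
print(composed.GetPrimAtPath("/shot/horse"))
```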

Laurence says that USD support also helps them render complex scenes. “Arnold updates are like Christmas for me,” he says, adding that improvements to Arnold’s volume rendering and global light sampling save the team time.

Artist empowerment from AI

Judging from the quantity and quality of WeFX’s projects, its working methods have been serving the studio well, but those methods are evolving along with the standards and expectations of the industry. To keep increasing output at the highest quality, WeFX’s R&D department is exploring emerging AI tools that improve workflows while leaving artists in creative control.

For example, Laurence is excited about researching Maya’s new AI animation tool, MotionMaker, which automates some of the tedious, repetitive work of character locomotion, as a way to speed up animation. Another timesaver may come from the latest addition to the M&E Collection, Autodesk Flow Studio, which derives 3D scenes, rigged characters, and animation data from single-camera video footage. Laurence mentioned the prospect of quickly gathering reference animation from phone camera videos using Flow Studio.

“Some [AI] tools and workflows empower artists to do more,” Laurence says. “These workflows will come to the surface, and I look forward to technologies that minimize the laborious aspects of what we do and allow us to spend more time on art.”


For more details and to view an insightful breakdown reel of WeFX’s horse chase sequence from 1923, read more about their work on their website or watch the webinar recording.