For the past 15 years, movie production studio Laika has been a pioneer in animation, seamlessly blending the tactile artistry of stop-motion with technological advancements including CGI and 3D printing.
Now the Oregon-based studio is at the forefront of another leap by incorporating AI in the making of its films.
During production of its Oscar-nominated film Missing Link, the technology and visual effects teams developed a prototype of an AI tool in collaboration with Intel.
The problem to solve: lines.
The puppets used in Laika’s films have a wide range of facial expressions created en masse by 3D printing. In Missing Link alone, more than 106,000 faces were printed in pieces (i.e., lower and upper parts of a face). The process gives animators fine control over how a character’s face moves, but it also leaves a visible seam where the lower and upper pieces connect.
“It was a huge challenge because when you go to the movies and you’re watching character performances, you are looking at the characters’ faces,” says Steve Emerson, visual effects supervisor at Laika. “You’re looking at their eyes, and that’s precisely where we were going to be doing this cosmetic work.”
Emerson joined Laika 12 years ago for the production of Coraline, and on that film and every one since, his team has relied on the highly manual process of RotoPainting.
Much as in Photoshop, RotoPainting involves isolating and removing unwanted items from a scene, such as puppet rigs or seam lines. While effective, it’s also incredibly time-consuming.
“We want to try to give the animators as much leeway to create a natural performance onstage,” says Jeff Stringer, production technology director at Laika. “That means we have to use a lot of rigs. The 3D printing of the faces was all about giving them a natural performance that you couldn’t do with the mechanical face. So with almost every one of these innovations in our filmmaking, there comes a cost to the Roto team.”
Missing Link, for example, comprises 136,800 frames. With manual RotoPainting, cleaning up around 50 frames took a full day.
“You’ve got a lot of great tools for tracking parts of the frame and painting things out, but it is still frame by frame,” Stringer says. “So it’s one of the things that was just begging for acceleration.”
With the AI prototype Laika built with Intel, the team cut that time roughly in half.
The initial testing was done on the Missing Link character of Lionel (Hugh Jackman) and had a solid success rate: Of the 50 shots processed to remove seam lines, 35 came back usable or needing only minor touch-ups. Although the AI toolset wasn’t used in the final version of the film, Stringer says the team is continuing to test it on different characters to build a robust data set for machine learning, in hopes of deploying it on the studio’s next film.
They’re also exploring the broader use cases of AI in RotoPainting to remove more than seam lines (e.g., puppet rigs).
“If this is ultimately successful, we’ll probably end up with a data scientist person on staff who’s working with early data sets from all the characters—and building those training models,” Stringer says.
While innovative and time-saving, the use of AI in stop-motion films could rankle purists of the art form. Laika has been no stranger to employing technology, but are its advancements in AI pulling the studio further away from the specific artistry of stop-motion?
For Emerson, it comes down to what serves the story best.
“I’ll give you an example, an explosion with a bunch of smoke,” he says. “You want to animate smoke one frame at a time, you’re going to have to go out and get some sort of a material like cotton balls. If you’re an audience member that’s watching a film and there’s an explosion and it’s a bunch of cotton balls, it’s gonna look really cool. And it’s fun. Believe me, I love that type of filmmaking. But there’ll be a split second where it’s going to take you out of it. And you’re going to recognize that it’s not smoke—it’s cotton balls.”
Emerson stresses that the animators try to do as much in-camera with rigs and sets as possible. But they’re not afraid to chase digital solutions when needed, including AI. The ultimate goal of something like the RotoPainting prototype is to spend less time on the minutiae of production and more time figuring out how best to present a story.
“There are two types of work that we do here in visual effects: There’s the objective work and there’s the subjective work. The objective work is there is some kind of a crazy rig in the frame that needs to go away. There is a line that is going across the puppet face that needs to be removed,” Emerson says. “The subjective side of it is, is the frame beautiful? Is it exciting? Is it unlike anything that I have ever seen before? So a lot of what we’re chasing here is the ability to be able to move past the objective stuff. Hopefully we can leverage AI and machine learning to help us, and then we can focus more on the subjective and really try and make these images look unlike anything anybody has ever seen before with stop-motion films.”