When you sit your kids down in front of Netflix to watch Word Party, an animated series debuting in 2016, the last thing they’re going to be thinking about is the intricacies of how it was made. And that’s good: It means that the team of puppeteers, artists, and producers behind the scenes are doing their jobs. In fact, even the most perceptive of grown-ups wouldn’t guess that many of the characters are actually digital puppets controlled by live performers, or just how dramatically this unique approach to animation has changed in recent years.
To get a clearer picture of how these characters came to be, you would have to venture to the Jim Henson Company’s headquarters in Los Angeles. It’s in this building, on the site of the historic Charlie Chaplin Studios, that shows like Word Party are shot and edited. And what you’d find here isn’t just rows of animators sitting at computer workstations (although there’s plenty of that going on); you’d also see people wearing proprietary gloves and body sensors, acting out scenes as their motions are mapped to the movements of animated characters on a large screen.
This is what’s known as the Digital Creature Shop, a modern complement to the cloth-and-fur puppetry that made the late Jim Henson a household name.
“What we’ve created over the past decade is a toolset that’s a combination of hardware and software,” says Steffen Wild, the visual effects supervisor at Jim Henson’s Creature Shop. The conceptual underpinnings of digital puppetry go back to the late 1980s, when Henson himself experimented on the clunky, expensive machines available at the time. But 25 years ago, the technology just wasn’t up to the task of executing Henson’s vision for digital puppets, so the company put its efforts on hold for a while.
Over the last decade, the speed of processors and networks has caught up to the needs of modern Muppets. Using its own homegrown game engine and a hardware setup that combines wearable sensors with a three-camera rig, the JHC team is able to turn bodily movements into on-screen animations in real time, bringing these digital puppets to life.
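The article doesn’t describe JHC’s proprietary engine, but the core idea of driving a digital puppet from wearable sensors can be illustrated in miniature. The sketch below is purely hypothetical: it assumes each sensor reports Euler angles for a joint, smooths the readings to damp jitter, and clamps them to the rig’s range of motion before posing the on-screen character. The names (`JointSample`, `PuppetJoint`) and the specific smoothing approach are illustrative assumptions, not JHC’s actual implementation.

```python
from dataclasses import dataclass

@dataclass
class JointSample:
    """One reading from a wearable sensor (hypothetical format), in degrees."""
    pitch: float
    yaw: float
    roll: float

def smooth(prev: float, new: float, alpha: float = 0.3) -> float:
    """Exponential smoothing: blend toward the new reading to damp sensor jitter."""
    return prev + alpha * (new - prev)

class PuppetJoint:
    """Maps smoothed sensor rotations onto a single joint of a character rig."""
    def __init__(self, limit_deg: float = 90.0):
        self.limit = limit_deg  # rig's range of motion, so the puppet can't hyperextend
        self.angle = 0.0        # current on-screen joint angle, degrees

    def update(self, sample: JointSample) -> float:
        # Clamp the raw reading to the joint's limits, then ease toward it.
        target = max(-self.limit, min(self.limit, sample.pitch))
        self.angle = smooth(self.angle, target)
        return self.angle

# Each frame, a performer's movement becomes a clamped, smoothed pose:
elbow = PuppetJoint(limit_deg=90.0)
elbow.update(JointSample(pitch=120.0, yaw=0.0, roll=0.0))  # clamped to 90, eased toward it
```

In a real pipeline this per-joint mapping would run for an entire skeleton at the camera frame rate, with the resulting pose rendered by the game engine each frame.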
As you might imagine, this process generates quite a bit of data. And those terabytes of footage need to be moved around the studio as quickly as possible: from the cameras to the editorial workstations to the post-production machines, and throughout the entire production workflow. Once upon a time, transferring this data across the studio was a slow, tedious process. A 30-minute episode of an animated television show could take six months to produce. But recently, that changed.
Thanks to recent infrastructure upgrades, the Jim Henson Company was able to cut production time from six months per episode to as little as six weeks. The upgrades, executed as part of a partnership with data and networking services company Brocade, moved JHC’s internal networks to a more modern fabric computing infrastructure.
“The core technology needs to be really watertight so there is no downtime,” Wild explains. The new network and related infrastructure make it easier to shoot multicamera digital puppetry scenes in real time, but the benefits also seep into the rest of the production process: Footage can now be downloaded at warp speed, and multiple editors can work on a scene more easily than ever before.
For the Jim Henson Company, the time savings are enormous.
“Most of the time goes back into the creative process,” says Wild. “If we can save an hour in the process by eliminating technical steps that we had to do before, that hour is now being used for creative time for the directors.”