
The Monster Supercomputing Achievement That Lights Up Disney’s “Big Hero 6”

Disney’s bleeding-edge rendering software Hyperion enabled Big Hero 6 animators to simulate the behavior of light in a fraction of the time.

[Photo: courtesy of Walt Disney Studios Animation]

The November 7 arrival of Disney’s Big Hero 6 is as much a supercomputing triumph as it is an animated feast. It’s the coming-out party for Hyperion, cutting-edge light-rendering software shaped by Disney artists and engineers working in concert for two years.


The system logged 200 million computing hours, spawned a number of companion programs, and enabled animators to imbue the film’s fantastical settings, elements, and characters with a realism and dimension that would otherwise have been impossible.

“It allows us to put more on the screen with the same number of artists, creating a richer world that better supports the story,” says Disney Animation chief technology officer Andy Hendrickson.

Disney Animation CTO Andy Hendrickson planned on a physics career before being lured by animation. Photo: Ricky Middlesworth/Disney

The Japanese-influenced film, directed by Don Hall and Chris Williams from an obscure Marvel property, chronicles a group of social misfits and an ingenuous robot as they attempt to save their city of San Fransokyo (a mash-up of San Francisco and Tokyo) from an evil technologist. Hyperion was also used for the Disney short Feast, which will precede BH6 screenings during the theatrical run.

San Fransokyo at night.

How it Works

Hyperion tracks how light rays bounce off multiple objects in an environment before they enter your eyes. When a ray hits an object with a diffuse surface, say, something with a satin sheen, it scatters in many directions. Each of the scattered rays continues bouncing off other objects until they all ultimately lose energy and dissipate. The result is softer, more diffuse light, with more nuance and shadow, creating a more realistic look.
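To make the idea concrete, here is a minimal path-tracing sketch in Python, written for this article rather than drawn from Hyperion’s code; the absorption rate and cutoff are hypothetical stand-ins. It follows a single ray’s energy as it decays across diffuse bounces until the ray effectively dissipates.

```python
import random

# A minimal path-tracing sketch, not Hyperion's code: follow one light ray
# as it bounces between diffuse surfaces, losing a hypothetical fraction of
# its energy at each hit, until it has effectively dissipated. A production
# renderer traces millions of such paths per frame.

ABSORB = 0.3    # assumed energy lost per diffuse bounce
CUTOFF = 0.01   # below this, the ray no longer contributes visibly

def trace_one_ray():
    energy, bounces = 1.0, 0
    while energy > CUTOFF:
        # A diffuse surface would scatter the ray in a random new direction;
        # we skip the geometry here and track only the energy falloff.
        energy *= (1.0 - ABSORB) * random.uniform(0.8, 1.0)
        bounces += 1
    return bounces

print(trace_one_ray())   # typically 8-13 bounces before the ray fades out
```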

But getting there is a unique kind of hell. Tracking those hundreds, possibly millions, of rays in all directions would overload a computer’s random access memory (RAM), which stores data for quick retrieval during operation.

Early Disney research into direct and indirect light-bounce simulation, conducted with the University of Utah and the University of North Carolina.

“Before, we’d been hampered by the level of computation needed at any one time, the amount of memory needed to store it, and the cost,” says Hendrickson. “Ten years ago, the level of investment would have been greater than the cost of two to three films. Hyperion allowed a 100-fold increase in image complexity for a fraction of the [undisclosed] cost.”

(L-R) Hyperion’s development was overseen by technical supervisor Sean Jenkins, principal software engineer Brent Burley, and look development supervisor Chuck Tappan. Photo: Ricky Middlesworth/Disney

Hyperion solves the storage and computational problems by reducing the number of calculations required at any one time while increasing the number of light rays traced per second. The software winnows the light scatter into six main directions. As each ray hits another object, the program divides that scatter into another six directions, and so forth, until the light rays disappear. Each time a ray strikes an object and scatters off it, it’s called a bounce. Hyperion carried most of its scattered light rays out to 10 bounces.
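In code, the winnowing strategy looks something like the toy recursion below. This is a sketch under assumed numbers, not Hyperion’s implementation: the six axis directions and the 10 percent per-bounce energy loss are illustrative. The payoff is that splitting six ways per bounce keeps the ray count predictable at every depth.

```python
# A toy version of the six-direction winnowing described above.

AXES = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]

def scatter(energy, depth, max_bounces):
    """Count rays traced when every bounce splits into six directions."""
    if depth == max_bounces or energy < 0.01:
        return 0  # ray has hit the bounce cap or dissipated
    count = 0
    for _axis in AXES:                   # six main scatter directions
        child = energy * (1 / 6) * 0.9   # split evenly, lose 10% per bounce
        count += 1 + scatter(child, depth + 1, max_bounces)
    return count

# The film carried most rays to 10 bounces; a shallow cap keeps the demo fast.
print(scatter(1.0, 0, 3))   # 6 + 36 + 216 = 258 rays traced
```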

From there, Hyperion applies another set of algorithms, called filtering passes, that interpolate the additional rays that, in reality, would have occurred between the six directions, giving the appearance of infinite light bounces and a more nuanced glow. Without Hyperion, animators would have rendered this manually, which, given the image complexity in this movie, would have been impossible in the allotted production time.
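The interpolation concept behind a filtering pass can be sketched as a weighted blend of the six directional samples. The weighting scheme below is an assumption for illustration, since Hyperion’s actual filters haven’t been published; it shows only how in-between directions can be estimated from a handful of measured ones.

```python
import math

# Estimate the light arriving from any in-between direction by blending the
# six axis samples, each weighted by how closely it aligns with the query.

AXES = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]

def interpolate(direction, axis_samples):
    """Blend axis samples, weighted by alignment with `direction`."""
    weights = []
    for axis in AXES:
        dot = sum(d * a for d, a in zip(direction, axis))
        weights.append(max(0.0, dot))   # ignore axes facing away
    total = sum(weights)
    return sum(w * s for w, s in zip(weights, axis_samples)) / total

# Light measured along the six axes (hypothetical values)...
samples = [1.0, 0.2, 0.8, 0.1, 0.5, 0.3]
# ...interpolated for a diagonal direction between +x and +y:
d = (1 / math.sqrt(2), 1 / math.sqrt(2), 0.0)
print(round(interpolate(d, samples), 3))   # blends +x and +y samples: 0.9
```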

Ten bounces was also crucial for the translucent look of one of the film’s main characters, a soft robot named Baymax. “If we kept it to two bounces, he ended up looking like hard plastic,” says Hendrickson. “But 10 light bounces taking place inside his body, before the light finally emerged, gave him his translucence.”
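A back-of-the-envelope random walk shows why the cap matters; the per-bounce exit chance below is hypothetical, not a measured property of Baymax’s material.

```python
import random

# Hypothetical numbers, not Disney's material model: a photon inside a soft
# body keeps bouncing internally until it happens to exit, or it hits the
# renderer's bounce cap. A low cap truncates the walk, so light exits almost
# immediately and the surface reads as hard plastic.

EXIT_CHANCE = 0.25   # assumed chance the photon escapes at each bounce

def average_internal_bounces(max_bounces, trials=100_000):
    total = 0
    for _ in range(trials):
        bounces = 0
        while bounces < max_bounces and random.random() > EXIT_CHANCE:
            bounces += 1
        total += bounces
    return total / trials

# A 10-bounce cap lets light wander and diffuse before emerging, producing
# the translucent look; a 2-bounce cap cuts that walk short.
print(average_internal_bounces(2), average_internal_bounces(10))
```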

Disney Animation increased its processing power by connecting four rendering farms, three in Los Angeles and one in San Francisco, into a giant supercomputer of nearly 4,600 computers running 55,000 cores, the individual CPU processing units. (By comparison, Frozen used 26,000 cores.) The division’s previously developed automated management system, Coda, guided the information flowing between the four rendering farms, enabling the system to process 400,000 rendering jobs overnight, ready for the next morning.
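Coda’s internals haven’t been published, but the core dispatch problem can be sketched as a greedy, least-loaded assignment of jobs to farms. The farm names and per-farm core counts below are invented, chosen only so they sum to the article’s 55,000; real schedulers elaborate on this considerably.

```python
# A toy render-job dispatcher under invented assumptions (not Coda).

FARMS = {"LA-1": 15_000, "LA-2": 15_000, "LA-3": 15_000, "SF": 10_000}

def dispatch(jobs, farms):
    """Assign each render job to whichever farm has the most free cores."""
    free = dict(farms)
    placement = {}
    for job, cores_needed in jobs:
        farm = max(free, key=free.get)   # pick the least-loaded farm
        free[farm] -= cores_needed
        placement[job] = farm
    return placement

# Hypothetical overnight queue: each shot needs 8 cores.
jobs = [(f"shot_{i:04d}", 8) for i in range(10)]
print(dispatch(jobs, FARMS))
```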

Carnegie Mellon University’s soft robotics research inspired the robot character of Baymax (shown at right). At left, the inflatable soft robot arm developed at CMU’s Quality of Life Technology Center uses levels of force gentle enough to safely interact with humans. Photos: QoLT/CMU, Disney

People, buildings, trees, and microbots

Complementing Hyperion, Disney coders created additional automation software to expedite and enhance other areas of rendering. One process extrapolated algorithms describing San Francisco and Tokyo architectural styles to generate the city’s buildings. Another, code-named Denizen, created 16,000 unique city occupants by recombining portions of thousands of different base characters in a variety of textures, fabrics, and colors. (This is different from crowd-simulation software that simply replicates 10 body types.) Another in-house program, Bonzai, automated the creation of the city’s 250,000 trees. The process involved interspersing the building and rendering layers of trees, people, and architecture.
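The recombination math is straightforward to sketch. The part pools below are invented placeholders (Denizen’s real asset lists aren’t public), but they show how modest numbers of interchangeable pieces multiply into thousands of unique extras.

```python
import itertools
import random

# Invented part pools: the point is only that recombining interchangeable
# pieces multiplies quickly into a large, duplicate-free crowd.

heads = [f"head_{i}" for i in range(20)]
bodies = [f"body_{i}" for i in range(20)]
fabrics = ["denim", "wool", "silk", "leather"]
colors = ["red", "teal", "ochre", "navy", "plum"]

combos = list(itertools.product(heads, bodies, fabrics, colors))
print(len(combos))   # 20 * 20 * 4 * 5 = 8,000 possible unique citizens

crowd = random.sample(combos, 1_000)   # draw a crowd with no duplicates
print(crowd[0])
```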

“It was easy enough to use so the animators could make avatars of themselves and place them in the city as extras,” says Hendrickson. “So a lot of folks in the studio are also in the movie.”


Still another custom process directed the movement of the microbots, spindle-shaped micro-computers—some 20 million onscreen in a given shot—that both swarmed in tandem and hurdled over one another in circuit-like waves, eventually organizing into structures. (This is more algorithmically complex than the flowing motion of the iconic birth of Sandman scene in Spider-Man 3.)
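Disney’s microbot solver is proprietary, but the simplest version of a traveling swarm wave, with each bot offset in phase from its neighbor, can be sketched in a few lines; all parameters here are illustrative.

```python
import math

# Each bot's height follows a sine wave whose phase depends on its position,
# so neighbors crest one after another in a traveling, circuit-like wave.

def wave_heights(num_bots, t, wavelength=10.0, amplitude=1.0, speed=2.0):
    """Height of each bot along a line at time t, forming a moving wave."""
    return [
        amplitude * math.sin(2 * math.pi * (x - speed * t) / wavelength)
        for x in range(num_bots)
    ]

for t in range(3):   # three animation frames
    print([round(h, 2) for h in wave_heights(num_bots=12, t=t)])
```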

Examples of where “microbot” robotic swarm technology stands today.

Co-Creating with Animators

Disney Animation embarked upon the BH6 story four years ago, but only started working on Hyperion two years ago, gleaning constant input from a user base of 400 artists and programmers. Sometimes, the artists would ask the programmers to build a tool enabling them to realize a certain look; other times, the programmers achieved a simulation that gave the artists new ideas for visualizing the film.

“It’s pretty aggressive building something up that fast, but we were trying to make a world that supported an awesome story,” says Hendrickson, who hopes the collective software will facilitate a film a year. “We were exploring the art and algorithms simultaneously, riffing off each other, building the renderer at the same time they were using it to make art. But without these methods, we couldn’t handle the complexity of this world. We consider our in-house software to always be in beta, because we want it to always be evolving.”

P.S. When you see the film, stay till the very end, past the credits.

About the author

Susan Karlin is an award-winning journalist in Los Angeles, covering the nexus of science, technology, and arts, with a fondness for sci-fi and comics. She's a regular contributor to Fast Company, NPR, and IEEE Spectrum, and has written for Newsweek, Forbes, Wired, Scientific American, Discover, NY and London Times, and BBC Radio.
