This week, a little three-pronged spacecraft named Juno arrived at Jupiter.
No one was around to see it. Yet thousands of people watched it through an app that uses Juno's predicted telemetry, the same math the spacecraft uses to navigate, to visualize the trip in animated real time. As the spacecraft entered Jupiter's orbit, the visualization showed the historic moment within just one second of the real thing. For a spacecraft at the end of a five-year journey, it was a triumph of engineering and of storytelling with data.
The app, called Eyes on Juno, was created by a team based in Pasadena at the Jet Propulsion Laboratory. Known as the Visualization Technology Applications and Development Group, it collaborates with scientists and engineers to visualize hard data not just for the public but often for the mission itself. For Juno, the team fed the mission’s telemetry into its visualization software and rendered it through the popular cross-platform Unity game engine. If you were watching, you could see Juno arc around Jupiter in real-time. You could replay the most exciting moments. You could explore the components of its design, and more.
It made the far edges of the solar system feel almost cozy.
The Visualization Technology Applications and Development Group is managed by Kevin J. Hussey, who started at JPL in the 1970s as a summer intern. Originally a climatic geomorphology student who also studied programming, Hussey has watched the field of digital animation grow up around his career; he left JPL only once, for an eight-year stint at Disney, before returning.
In the late ’70s and early ’80s, the processing-heavy graphics work Hussey was doing was still nascent. His first big break came in 1981 from an atmospheric scientist named Moustafa Chahine. “‘Hey, I’ve done all this modeling of the atmosphere, but nobody will pay attention to it. What can we do?’” Hussey remembers Chahine asking. His response? “Let’s make a map.” Soon, Hussey had established a niche at JPL visualizing the work of its scientists with emerging animation tools. “It was cartography as it was supposed to be,” he says. “The combination of art and science.”
Today, the team is exploring the outer edges of what could be called “data visualization”: a hybrid of data and creative animation that is as close as most of us will ever get to space. Interactive experiences like Eyes on Juno actually evolved out of an internal need at NASA, what Hussey calls the “seven minutes of terror,” when engineers on the ground must wait for news of a critical mission event to travel back to Earth. Originally, these animations based on predicted telemetry were created so JPL could see a very close approximation of what was happening in real time. The idea caught on.
When NASA’s Curiosity rover landed on Mars in 2012, hundreds of thousands of people watched the event unfold through the team’s web app, known broadly as NASA’s Eyes. After Curiosity’s real landing data reached Earth, it turned out the animation had run just 0.6 seconds ahead of the real thing. “They weren’t watching a movie,” Hussey said in an interview later. “They were interacting with the predicted telemetry,” right down to the half-second.
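The sub-second agreement is all the more striking given the signal delay involved: mission control can never watch live, because radio signals take many minutes to cross interplanetary distances. A quick back-of-envelope calculation shows why predicted telemetry is the only way to "watch" in real time (the distances below are approximate figures assumed for illustration, not mission data):

```python
# Rough one-way light-time calculation, illustrating why mission control
# watches *predicted* telemetry rather than live data.
# Distances are approximate values assumed for illustration.

C_KM_PER_S = 299_792.458  # speed of light in vacuum, km/s

def one_way_light_time_minutes(distance_km: float) -> float:
    """Minutes for a radio signal to cross the given distance."""
    return distance_km / C_KM_PER_S / 60.0

# Earth-Mars distance around Curiosity's 2012 landing: ~2.48e8 km (assumed)
mars_delay = one_way_light_time_minutes(2.48e8)

# Earth-Jupiter distance around Juno's 2016 arrival: ~8.7e8 km (assumed)
jupiter_delay = one_way_light_time_minutes(8.7e8)

print(f"Mars signal delay:    ~{mars_delay:.1f} minutes")
print(f"Jupiter signal delay: ~{jupiter_delay:.1f} minutes")
```

By the time engineers hear that a landing burn has started, the spacecraft has long since lived or died; the animation driven by predicted telemetry is the closest thing to watching it happen.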
Visualizing the fringes of the solar system is a creative challenge, because no one really knows what it looks like. The team must make educated guesses about what a planet or atmosphere is really “like.” Jupiter’s magnetosphere, for example, is invisible, so in the app it spouts from the planet like a ribbon, while the planet’s intense radiation belts (also invisible) are rendered as a hazy veil of colors.
When I talked to Hussey, the day after Juno’s big moment, he was already gearing up for his next big visualization project: a module for the NASA’s Eyes app that shows the Cassini mission to Saturn. Cassini is currently preparing for its “grand finale,” when it will repeatedly hurtle through Saturn’s rings and eventually plunge to its demise. The team was meeting with JPL mission scientists to gather information that will help them visualize those rings in as much scientific detail as possible.
Thanks to a recent upgrade to the Unity 5 game engine, the team plans to animate the rings as true particle systems, meaning they’ll actually behave according to current scientific models of ring physics. When Cassini edges closer to the rings later this year, the team wants its animation to be as accurate as possible, based on what scientists know about the rings’ texture and physics.
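To get a feel for what it means for ring particles to “behave” physically: each particle can be advanced along its own Keplerian orbit, with gravity dictating its speed from its distance to Saturn. Here is a minimal sketch of that idea in Python, not the team’s actual Unity code; the particle count and ring radii are illustrative values:

```python
import math
import random

# Minimal Keplerian particle-system sketch: each ring particle moves on a
# circular orbit at the angular speed gravity dictates for its radius.
# An illustration of the concept, not the team's Unity implementation.

GM_SATURN = 3.793e16  # Saturn's gravitational parameter, m^3/s^2

def make_ring(n: int, r_min: float, r_max: float):
    """Scatter n particles at random radii and angles within an annulus."""
    return [
        {"r": random.uniform(r_min, r_max),
         "theta": random.uniform(0.0, 2.0 * math.pi)}
        for _ in range(n)
    ]

def step(particles, dt: float):
    """Advance every particle dt seconds along its circular orbit."""
    for p in particles:
        omega = math.sqrt(GM_SATURN / p["r"] ** 3)  # rad/s at this radius
        p["theta"] = (p["theta"] + omega * dt) % (2.0 * math.pi)

# Saturn's A ring spans roughly 122,000-137,000 km from the planet's center
ring = make_ring(1000, 1.22e8, 1.37e8)
step(ring, dt=60.0)  # advance one minute of simulated time
```

Because the inner particles orbit faster than the outer ones, even this simple model reproduces the shearing motion that gives the rings their streaked look, which is exactly the kind of behavior a hand-keyed animation can’t capture.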
They’re also working to bring VR support to the app, along with more advanced platforms for exploring data from Mars and a bevy of exoplanets. Will there ever be a day when we watch a mission live-streamed through real video? It’s certainly possible. When NASA launches its InSight mission to Mars in 2018, the main spacecraft will be accompanied by two cubesats, small, four-inch craft that will provide support. They won’t have cameras to stream video of InSight to our phones, but it’s plausible that someday their successors could.
“How long will it be that it’ll be cheap enough to literally put a camera onboard to watch the spacecraft?” Hussey wonders. One day, we may watch actual video of a Mars landing, shot by a companion craft. In the meantime, you can check out Eyes on Juno here.