NASA is taking its cue from the gaming world.
An eight-year collaboration between NASA’s Jet Propulsion Laboratory (JPL) and Microsoft is bearing fruit in spacecraft design and space exploration. JPL scientists helped adapt the HoloLens mixed-reality platform to space engineering needs—such as integrating body and hand gesture commands and a tracking system—before starting on mission-specific software nearly three years ago. Mixed reality overlays immersive virtual images onto the viewer’s actual environment.
Through operation utilities known as OnSight, ProtoSpace, and Sidekick, the first to benefit are the scientists guiding the Curiosity Mars Science Laboratory rover, astronauts aboard the International Space Station, and teams working on the Mars 2020, Europa, and French-helmed, Earth-focused Surface Water and Ocean Topography (SWOT) missions.
“If we build the right interfaces that allow people to naturally interact with environments, we can unlock those abilities in ways that allow us to accomplish new things in space exploration,” says Jeff Norris, JPL project lead for OnSight and Sidekick.
OnSight, proprietary software in use since last summer, enables scientists to “work on Mars” together from different locations around the world. It reconstructs a 3-D version of the Martian landscape from collective image data gathered by Curiosity’s cameras and by satellites orbiting Mars. Users make specific gestures to select commands from virtual drop-down menus appearing in their headset displays and to teleport to different locations in the area. Currently, the technology is helping the Curiosity pilots decide where to drive the rover and which features to study in more detail, while giving them a better sense of being in the field.
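The reconstruction idea can be sketched very loosely as fusing overlapping terrain observations from multiple sources into one shared model. OnSight’s actual pipeline is proprietary and far more sophisticated; the cameras, grid cells, and height values below are invented purely for illustration:

```python
# Toy sketch (not OnSight's real algorithm): fuse overlapping
# terrain-height observations from several hypothetical cameras
# into a single shared grid, the way a common 3-D scene might be
# assembled from rover and orbiter imagery.

from collections import defaultdict

def fuse_heights(observations):
    """Average all height readings that fall in the same (x, y) cell.

    observations: iterable of (x, y, height) tuples from any source.
    Returns a dict mapping (x, y) -> fused height.
    """
    sums = defaultdict(float)
    counts = defaultdict(int)
    for x, y, h in observations:
        sums[(x, y)] += h
        counts[(x, y)] += 1
    return {cell: sums[cell] / counts[cell] for cell in sums}

# Two invented "cameras" observing an overlapping patch of terrain:
rover_cam = [(0, 0, 1.0), (0, 1, 1.2)]
orbiter_cam = [(0, 1, 1.4), (1, 1, 0.9)]

terrain = fuse_heights(rover_cam + orbiter_cam)
# The overlapping cell (0, 1) is averaged across both sources.
```

Where the two sources disagree on a cell, a simple average stands in here for the real system’s far more careful photogrammetric merging.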
“It’s really about hypothesis generation,” says Norris. “It enables them to take in at a glance and understand more about the surroundings of the vehicle than they can by looking at two-dimensional pictures on a computer screen. It will help them to choose measurements to test their hypotheses and reach conclusions faster.”
“Eventually you’ll be able to go from site to site and be able to compare sites, and that’s really hard to do now,” says Luther Beegle, a surface sample system scientist with the Curiosity team. “Unless you start opening up all these images on your computer, and that gets you sidetracked.”
OnSight technology is also being used for Destination: Mars, an interactive, mixed-reality experience debuting at the Kennedy Space Center Visitor Complex this summer. The experience gives the public a chance to walk on Mars, with astronaut Buzz Aldrin as the holographic tour guide and JPL engineer Erisa Hines explaining what NASA has learned about the planet so far.
ProtoSpace creates life-sized, 3-D holograms of spacecraft designs to aid in building the actual craft. It’s currently being used to design and assemble spacecraft for the Europa, Mars 2020, and SWOT missions.
Computer-aided design is limited in depicting size, scale, and physical position. “Many of our spacecraft wouldn’t fit in this room, so fitting them onto a computer screen, you’re losing a lot of that knowledge and intuition about the size of things,” says Norris. “Although you can spin the image around with a mouse or keyboard, it’s not responding to you as you move your head around it. And that’s how you understand the shape of things in the real world.”
The software enables a constantly evolving design, and can link people from around the world, so everyone is looking at the same virtual model and each other’s avatars. The program can simulate environments around the vehicle, so teams can rehearse steps during particularly complicated assembly processes.
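The shared-model idea behind such a session can be sketched in miniature: every change to the design is broadcast so that each participant’s local view stays identical. ProtoSpace’s networking is not public, so the session class, site names, and part transforms below are all hypothetical:

```python
# Toy sketch (not ProtoSpace's real protocol): a shared design
# session where each update to the spacecraft model is broadcast
# to every connected viewer, so all sites see the same hardware.

class SharedModelSession:
    def __init__(self):
        self.viewers = {}            # viewer name -> local model copy

    def join(self, name):
        # Each viewer starts with the same (empty) model state.
        self.viewers[name] = {}

    def update_part(self, part, transform):
        # Broadcast one part's new placement to every viewer's copy.
        for model in self.viewers.values():
            model[part] = transform

session = SharedModelSession()
session.join("JPL-Pasadena")
session.join("CNES-Toulouse")
session.update_part("antenna", {"x": 1.0, "y": 2.0, "rot": 90})
# Both sites now hold an identical view of the antenna's placement.
```

A real system would also synchronize avatars and handle conflicting edits; the sketch keeps only the core broadcast-and-mirror idea.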
“Now you’ve got a globally distributed team of spacecraft designers, all having a design discussion as if they’re in the same room together, about a piece of hardware that doesn’t exist,” says SWOT mechanical engineer Andy Etters. “We feel that’s a transformational opportunity.
“The more complicated the hardware and tests we have to do, the more risk is involved in the design process, so this is about mitigating risk,” adds Etters. “Can we identify risk early, before we have to spend a million dollars and delay a launch, damage it in the process of moving it around, or have it fail after launch?”
Sidekick, successfully tested last February, assists astronauts performing tasks aboard the International Space Station (ISS), reduces crew training requirements, and increases the efficiency of working in space. Until now, crews have relied on written and spoken instructions to operate devices. Sidekick can work alone—instructing users through augmented holographic illustrations displayed on top of the objects with which users interact—or enable a ground operator to see what a crew member sees and provide real-time guidance.
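The two modes just described can be sketched as a simple procedure walker: each step names the physical object the holographic instruction is anchored to, and a ground operator can inject extra guidance in real time. Sidekick’s software is not public, so the class, step names, and annotation format below are invented for illustration:

```python
# Toy sketch (not Sidekick's real software): step through a repair
# procedure whose instructions are anchored to physical objects,
# with optional real-time notes from a remote ground operator.

class ProcedureGuide:
    def __init__(self, steps):
        self.steps = list(steps)   # (anchor_object, instruction) pairs
        self.index = 0

    def current_overlay(self):
        # The text the headset would display on the anchored object.
        anchor, text = self.steps[self.index]
        return f"[{anchor}] {text}"

    def advance(self):
        if self.index < len(self.steps) - 1:
            self.index += 1

    def annotate(self, note):
        # A ground operator appends guidance to the current step.
        anchor, text = self.steps[self.index]
        self.steps[self.index] = (anchor, f"{text} ({note})")

guide = ProcedureGuide([
    ("valve-A", "Close hand valve"),
    ("panel-3", "Remove four fasteners"),
])
guide.annotate("ground: torque to 2 Nm max")   # remote guidance mode
guide.advance()                                # crew moves to next step
```

The standalone mode corresponds to walking the steps without annotations; the remote-guidance mode adds the operator’s notes on top.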
“This capability could lessen the amount of training that future crews will require and be an invaluable resource for missions deep into our solar system, where communication delays complicate difficult operations,” says Norris. “It’s also a step toward holographic computing in space exploration.”