
NASA Space-Tests A Supercomputer To Send To Mars

On Monday, a supercomputer blasts off to the International Space Station on a yearlong mission to test its mettle and see how it survives the rigors of space.

[Photo: NASA]


Ever kill a laptop by spilling a little water on it? How about a blast of cosmic radiation? That’s just one of the hazards facing computers for scientific research that will one day travel to Mars, tens of millions of miles from any spare parts. To gauge the wear and tear of spaceflight, NASA will launch a supercomputer made by Hewlett Packard Enterprise on August 14 for a yearlong mission aboard the International Space Station.

Unlike the other computers on the ISS, this one is not “hardened” with shielding and other provisions to survive heat, radiation, and other stresses. It was pulled right off the assembly line for HPE’s Apollo 4000-series enterprise servers.

Hardening is a must for computers controlling mission-critical functions such as navigation and communication, but the process limits the capabilities of computers used for research projects. “The traditional hardening takes time and money and ends up with out-of-date capabilities delivered late to the mission,” says Mark Fernandez, who manages the software portion of the tests for Hewlett Packard Enterprise. HPE and NASA want to see if a state-of-the-art, unprotected computer can survive space travel, using software to compensate for any damage.

Modern computers have software to correct errors, such as data not written correctly to memory. HPE and NASA will test whether these programs can root out and compensate for malfunctions resulting from damage in space. “So we monitor all of the environmental aspects of the server—its power, its temperature, its memory errors, its logging errors, etc.,” says Fernandez, “and when it looks like I’m having some issues, I can take corrective action with certain parameters, the most common of which would be, let’s slow the machine down and see if it can self-heal.”
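The monitor-and-throttle loop Fernandez describes can be sketched in a few lines. This is purely illustrative, not HPE’s actual software; the telemetry fields, thresholds, and action names are all invented for the example.

```python
# Illustrative sketch (not HPE's real monitoring software): watch a
# server's environmental telemetry and pick a corrective action, such
# as slowing the machine down so it can "self-heal." Thresholds invented.

from dataclasses import dataclass

@dataclass
class Telemetry:
    temperature_c: float   # CPU temperature this interval
    memory_errors: int     # corrected memory errors this interval
    logged_errors: int     # miscellaneous logged faults

def corrective_action(t: Telemetry,
                      max_temp: float = 85.0,
                      max_mem_errors: int = 10) -> str:
    """Choose a response based on the server's health readings."""
    if t.temperature_c > max_temp or t.memory_errors > max_mem_errors:
        return "throttle"        # slow the machine down and see if it recovers
    if t.logged_errors > 0:
        return "log-and-watch"   # note the issue, keep running at full speed
    return "nominal"

print(corrective_action(Telemetry(90.0, 2, 0)))   # throttle
print(corrective_action(Telemetry(60.0, 0, 3)))   # log-and-watch
print(corrective_action(Telemetry(60.0, 0, 0)))   # nominal
```

The key design point is that the response is graded: a fried memory cell or hot core triggers a slowdown rather than a shutdown, which is exactly the trade the experiment is testing.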

I ask Fernandez if he expects any in-flight damage to a computer to be temporary, like wiping out some data, or permanent, like wiping out the drive that stores data. “That’s a very good question,” he says. “And the most honest answer I can give you is, I don’t know.” NASA and HPE want to see if a computer can survive even some permanent damage. It might run a bit slower if a processing core or some memory cells have been fried, but it could still be much more powerful and versatile than outdated hardware that went through the long hardening process.

“So we are taking the risk that the harsh environment of space will completely destroy our experiment,” says Fernandez. “That’s the point. We would like to see if we can protect this unmodified-at-all hardware and software.”


Power Where You Need It

As the distance from Earth grows, so does the need for onboard computing. On the ISS, scientists can easily beam down data they collect to be processed on larger earthbound computers. A 2016 upgrade provided the ISS with a 300 megabit-per-second connection to Earth.

Bandwidth is a lot tighter on the red planet. The Mars Reconnaissance Orbiter sends data back to Earth at between 0.5 and 4 Mbps. Then there’s the delay—at least 13 minutes for a signal to travel each way, which led to the harrowing blackout period during the Mars rover Curiosity’s perilous descent, as well as the painfully slow conversations depicted in the movie The Martian.

“There may be scientific experiments with data analytics, and it would be impossible to send all that data back from Mars over that really slow and precious link,” says Fernandez. “If I can do some preliminary analysis on site in the spacecraft or on Mars, then I can downsize the amount of information I need to send to Earth.”
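The savings from analyzing data on site can be large. A toy calculation, with entirely made-up sensor data, shows why summarizing a day of readings before transmission shrinks what must cross the slow Mars-to-Earth link:

```python
# Illustrative only: summarizing raw sensor readings on board instead of
# downlinking every sample. The readings and byte counts are invented.

import statistics

raw_samples = [20.0 + 0.01 * i for i in range(86_400)]  # one reading per second for a day

# Send a handful of statistics instead of every sample.
summary = {
    "count": len(raw_samples),
    "mean": statistics.fmean(raw_samples),
    "min": min(raw_samples),
    "max": max(raw_samples),
}

raw_bytes = len(raw_samples) * 8   # 8 bytes per float64 reading
summary_bytes = len(summary) * 8   # four 8-byte values
print(f"raw: {raw_bytes} B, summary: {summary_bytes} B, "
      f"reduction: {raw_bytes // summary_bytes}x")
```

Here a day of once-per-second readings shrinks from about 675 KB to 32 bytes, a reduction of more than 20,000x on this toy data.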

“Supercomputer” may conjure images of a room-size contraption that requires a power plant’s worth of electricity. In 1997, the first computer with a capacity of a teraflop, a trillion mathematical operations per second, sucked up 850 kilowatts of electricity. HPE’s Apollo 4000 series, with about the same computing power, uses around 400 watts, says Fernandez. It’s a lot more powerful than a typical desktop computer, but not an uncommon piece of hardware for an advanced research lab. (The most powerful supercomputer today is 93,000 times as powerful as the HPE Apollo system NASA is testing. It also consumes 15 megawatts.)
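A back-of-the-envelope check puts the efficiency gap in perspective, using only the figures quoted above (850 kilowatts in 1997 versus roughly 400 watts today, for about a teraflop each):

```python
# Rough efficiency comparison using the article's figures.
# "Teraflop" = one trillion mathematical operations per second.

TFLOP = 1e12

machine_1997_watts = 850_000   # first teraflop machine, 1997: 850 kW
apollo_watts = 400             # HPE Apollo 4000-class server: ~400 W

# Operations per second delivered per watt of electricity
eff_1997 = TFLOP / machine_1997_watts
eff_apollo = TFLOP / apollo_watts

print(f"1997 machine: {eff_1997:.2e} ops/W")
print(f"Apollo 4000:  {eff_apollo:.2e} ops/W")
print(f"Improvement:  {eff_apollo / eff_1997:.0f}x")  # 2125x
```

By these numbers, the same computing power now takes roughly 1/2,000th of the electricity it did two decades ago, which is what makes a supercomputer plausible on a spacecraft’s power budget at all.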

“I know from being in this industry for a while that until you give scientists a… supercomputer, you don’t know what they’re gonna do with it,” says Fernandez. “And you’re always surprised and excited at what comes out.”

About the author

Sean Captain is a technology journalist and editor. Follow him on Twitter @seancaptain.
