HP’s Post-Electronic Solution To Tomorrow’s Huge Data

The Internet of things will soon be spitting out more data than today's transistors can handle, but HP thinks it has a solution: The Machine.

Imagine a single device that, like the kids in Honey, I Shrunk the Kids and its sequel, comes in whatever size the storyline demands. It can be the size of a server and weigh hundreds of pounds, or the size of a PC, a smartphone, or a miniature sensor.

Welcome to The Machine: HP’s vision for a universal building block of the Internet of Things. The Machine is designed for a world with dramatically more data, most of it too big to move. The device—which HP says can fill the role of a phone, a server, or a workstation—is a big bet for HP as growth in the PC market continues to slow.

Huge Data And The Post-Electronic Future

Today most consumers think in gigabytes (10^9 bytes) or terabytes (10^12 bytes) when they’re buying a phone or a hard drive. But we’re swiftly moving toward a world where data will be measured in geopbytes (10^30 bytes).
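To get a feel for the gulf between those units, a few lines of arithmetic help. This is just an illustration of the powers of ten cited above, nothing more:

```python
# Byte scales mentioned in the article, as powers of ten.
GIGABYTE = 10**9
TERABYTE = 10**12
GEOPBYTE = 10**30

# How many one-terabyte drives would hold a single geopbyte?
drives_needed = GEOPBYTE // TERABYTE
print(f"1 geopbyte = {drives_needed:.2e} one-terabyte drives")
```

That is a billion billion of today’s consumer hard drives for one geopbyte—the scale gap the rest of the article is wrestling with.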

Conventional computer architectures can’t cope with this much data, says Daniel Sanchez, a professor of electrical engineering and computer science at MIT.

Their energy efficiency is horrendous, too. “Think about data centers around the world today,” says Richard Friedrich, the director of Strategic Innovation and Research Services at HP Labs. “They (collectively) consume more power than the country of Japan.” Processing data at geopbyte scale with current technology would take many times the energy consumed by everything else on Earth combined.

To solve problems of this scale, HP looked for a microscopic solution.

For decades, computer engineers have kept making transistors (the basic units of processors and memory) smaller and smaller, packing ever more of them onto chips in order to increase the speed and output of their machines. Today transistors have become so small that the bizarre quantum behavior of individual electrons (the subatomic particles that carry current through transistors) is starting to interfere destructively.

“Electrons have this nasty issue that at a given moment you never know exactly where they are,” Christian Verstraete, chief technologist at HP’s Cloud Strategy Team, explained in a blog post. “We are doomed to reach a limit [as transistors get smaller], the moment we will no longer be sure a gate is opened or closed, as we don’t know whether it will have been hit by an electron.”


The Machine solves the problem by abandoning the electron—at least for some key purposes. “Memory systems are moving to the ion,” Friedrich says. Ions (atoms or molecules with a net positive or negative electric charge) are more stable and store information using less energy than electrons, Friedrich says, which makes them well suited to tiny memory cells. “It’s a very simple structure which we can make very, very small.”

This simple structure is a major reason why The Machine can come in different shapes and sizes, he explained.

Ion-based memory will only take tomorrow’s data so far. “And then we are going to communicate using photons,” Friedrich says. Photons are the subatomic particles of which light is made—transmitting information with them rather than with electrons moving along a copper wire increases a link’s data capacity by orders of magnitude while greatly reducing energy losses.

Combining these storage and transfer increases with more specialized processing cores will allow The Machine to process more data and to use 20 to 80 times less energy than today’s servers, HP says.

Keeping It Local

An 80-fold efficiency gain won’t mean much if we’re still trying to send exponentially more data around the world. So instead, The Machine is designed to process data where it finds it. Imagine millions of versions of The Machine: a smallish one on a traffic light, a larger one on an airliner, and tiny ones amid all the devices that will make up the Internet of Things.

The Machine would process the data from all these sources where it is, without sending it across the country to another server. HP calls this distributed mesh computing.
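HP hasn’t published the mechanics of its mesh, but the core idea—summarize at the source, ship only the summary—can be sketched in a few lines. Everything here (the `EdgeNode` class, the `summarize` method) is illustrative naming, not HP’s design or API:

```python
# Illustrative sketch of process-data-where-it-lives (not HP's design):
# each node reduces its raw sensor readings to a small summary locally,
# so only a handful of numbers ever cross the network.
from statistics import mean

class EdgeNode:
    def __init__(self, name):
        self.name = name
        self.readings = []  # raw data never leaves the device

    def record(self, value):
        self.readings.append(value)

    def summarize(self):
        # The summary, not the raw stream, is what gets transmitted.
        return {"node": self.name,
                "count": len(self.readings),
                "mean": mean(self.readings)}

# A traffic light and an airliner each process their own data in place.
nodes = [EdgeNode("traffic-light-42"), EdgeNode("airliner-A380")]
for node in nodes:
    for v in range(100):
        node.record(v)

summaries = [n.summarize() for n in nodes]
raw_points = sum(len(n.readings) for n in nodes)
print(f"{raw_points} raw readings reduced to {len(summaries)} summaries")
```

The design choice is the point: bandwidth and energy are spent on a few aggregate values per node instead of on streaming every reading across the country to a central server.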

The Machine is a big, holistic bet which HP casts in revolutionary terms. “We’ve democratized the ability to work on enormous data sets,” Friedrich says. “[Suddenly you can] do the kind of calculations on big data that today you can’t do, unless you are a government with a power plant next door.”

It’s also a play for a new revenue stream HP could own the way it once owned PCs. (IBM, a similar company in a similar situation, is making its own bet on how to solve the shrinking-transistor problem.) MIT’s Sanchez is optimistic HP can succeed. “It’s a natural evolution for HP to essentially replace main memory,” he says. But can HP sell the world on it?

