Smart thermostats like Google’s Nest report the temperature and can be controlled online. Fitness bands like the Fitbit Surge beam your steps, heart rate, and other fitness data to a smartphone and on to the cloud. Machines ranging from car engines to power plant turbines carry sensors that measure things like vibration and temperature to ascertain whether they are operating properly and to predict whether one is headed for a breakdown.
It’s easy to imagine a world in which every gadget is connected to the Internet of Things. Then what?
Ever more devices spewing data into the cloud are going to swamp our capacity to collect and analyze the information, says Edouard Rozan, cofounder of Berlin-based startup Teraki. “The networks are going to collapse because they are not ready to handle such an amount of data,” he says. “The data center to process the info is a huge investment . . . and a lot of times we have to take [action] in a very quick time.”
Not all connected devices are hooked up to fast networks, either. Low-power, wide-area networks conserve energy by putting strict limits on data transmission. One technology, Sigfox, might limit a sensor to transmitting 144 data packets per day. Up to 5 billion devices might be on such networks by 2022, according to research firm Strategy Analytics.
Those devices could include the sound sensors that cities use to monitor traffic. A network of 100 such sensors would generate about 1 terabyte of data per day, the company reckons. An offshore oil platform also spews about a terabyte each day.
Rozan says Teraki can address this problem by reducing data flow from sensors by up to 90%. When I ask Rozan if he’s talking about compression, treating something such as temperature readings like the musical notes that get squeezed into an MP3 file, he says that’s not efficient or fast enough. Compression requires a lot of computing to crunch the data for transmission, and then more to unpack it at the other end. That’s a power drain for battery-powered sensors, whether they’re in a fitness band or scattered around a city, and the compression-decompression process introduces a dangerous delay for things like collision-warning systems in cars. “It is not about realtime,” says Rozan. “It is about, I have to advise you five seconds ahead of time.”
While Teraki isn’t taking an MP3 approach, the company is, in a sense, treating data like music. Sensor readings aren’t simply streams of bits; they are sets of frequencies. If the speed readings from a car’s wheels or the temperature readings from its engine didn’t change at all, there would be no point measuring them. But there’s a limit to how much they can vary before the car falls apart. Between the two extremes is a range of frequencies. And mathematicians have long had tools for describing complex waveforms as sums of simpler components.
Teraki uses one called frequency decomposition. In the audio realm, this process breaks the complex sounds in voice or music into a sum of simple, pure tones known as sinusoids. Teraki does something similar with the “music” of the sensors, transmitting just a quick synopsis of the different frequencies to a receiver that can reassemble them into the complete signal—or close enough. Rozan claims the final data won’t be off by more than about 0.2%. If a download of Microsoft Office were off by that much, the program might not even run. But that’s well within the margin of error for a sensor. “If you measure this room today,” he says, gesturing around the conference room in Berlin, “and there are two sensors, they will give you two different numbers. One percent difference, for sure.”
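To get a feel for the idea, here is a minimal sketch of frequency decomposition in Python with NumPy. This is not Teraki’s proprietary algorithm; it simply illustrates the principle Rozan describes: decompose a sensor signal into frequency components, transmit only the handful of strongest ones, and let the receiver reassemble a close-enough copy. The simulated signal (a baseline temperature plus two vibration tones and noise) is an invented example.

```python
import numpy as np

def compress_signal(signal, keep=10):
    """Return a sparse spectrum holding only the `keep` strongest frequency components."""
    spectrum = np.fft.rfft(signal)
    # Indices of every coefficient EXCEPT the `keep` largest by magnitude.
    weakest = np.argsort(np.abs(spectrum))[:-keep]
    sparse = spectrum.copy()
    sparse[weakest] = 0  # discard the weak components; only `keep` values need transmitting
    return sparse

def reconstruct(sparse_spectrum, n_samples):
    """Receiver side: rebuild the time-domain signal from the sparse spectrum."""
    return np.fft.irfft(sparse_spectrum, n_samples)

# Simulated sensor trace: a 20-degree baseline, a slow 3 Hz wobble,
# a faster 40 Hz vibration, and a little measurement noise.
t = np.linspace(0, 1, 1000, endpoint=False)
signal = 20 + 2 * np.sin(2 * np.pi * 3 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)
signal = signal + 0.05 * np.random.default_rng(0).normal(size=t.size)

sparse = compress_signal(signal, keep=10)
restored = reconstruct(sparse, signal.size)

# 10 complex coefficients (plus their positions) stand in for 1,000 samples,
# yet the reconstruction error relative to the signal's range stays small.
error = np.max(np.abs(restored - signal)) / np.ptp(signal)
print(f"max relative error: {error:.4f}")
```

The key design point matches Rozan’s pitch: the sender does only a transform and a sort, the receiver does the reassembly, and the accuracy loss is a tunable trade-off against how many coefficients you transmit.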
In test results that Teraki has published, it claims to have achieved a 33% battery saving over traditional compression and a latency (delay) of 25 milliseconds, versus 580 milliseconds for handling the same data with compression.
Rozan manages the business side of Teraki. The mathematics comes from cofounders Markus Kopf and especially Daniel Richart. From 2008 through 2013, Richart did his PhD work at the Max Planck Institute of Quantum Optics in Munich under Nobel Prize-winning atomic physicist Theodor W. Hänsch. Richart worked on projects in quantum computing, an infant technology that uses the multiple possible quantum states of a particle to manipulate all possible combinations of data simultaneously.
“You have to manage a huge amount of data,” says Rozan. “All the photon states, they are generating so much information, that’s why they decided to start research on how to reduce the amount of information to avoid this bottleneck of data.” Richart realized that this tool for doing his work in quantum mechanics could be a business unto itself.
It’s not just a neat idea: Teraki’s funders include the Technical University of Berlin and the German government. It works out of hub:raum, an incubator run by Deutsche Telekom. Teraki is already implementing its technology with a major carmaker. Rozan asked me not to repeat which one, but Teraki is based in Germany, so the options are plentiful and high caliber. “We are doing a real implementation on a real car,” he says. The car’s sensors can generate around 250MB of data per hour, says Rozan, and Teraki reduces that to about 10MB per hour, with an accuracy loss of only 0.01%, he claims. Rozan says that Teraki is also “in conversation” with a major chipmaker’s IoT division.
In addition, Rozan has been talking with an auto parts manufacturer that uses an ultrasound sensor to inspect molded plastic components for defects. That single sensor generates about 3GB of data every minute. Over a month, Rozan reckons, that would require about €6,000 ($6,525) in data storage. “We can reduce from €6,000 to €600 the cost of the storage,” he says.
Teraki’s technology isn’t static. While the sensor side of the operation is all about saving power, the servers on the receiving end run machine-learning algorithms to judge the efficiency of the data abbreviation and develop better optimized methods. (The company’s name combines tera, as in terabyte, with KI, the German abbreviation for artificial intelligence, künstliche Intelligenz.)
Teraki’s process is all implemented in software; it’s not burned into silicon chips. So the algorithms can be constantly updated. And because the data is encoded in a method that keeps evolving, Teraki is essentially encrypting the data it transmits.
For Rozan, it’s a foregone conclusion that the Internet of Things is only practical if someone can get a handle on the data explosion.
“If we don’t put any intelligence at the source of the information,” he says, “it is going to collapse everything.”