What If You Could Have A Direct Feed From The Internet Into Your Brain?

By turning electronic data into vibrations, a vest could extend the reach of human perception to almost anything you can imagine.

Human perception is only limited by the sensory inputs that our brains receive. Cochlear and retinal implants, which replace the functions of damaged ears and eyes, prove that. The brain can adapt to most of what we throw at it.


On the TED2015 stage, neuroscientist David Eagleman proposed a novel kind of sensory input: a vest that transmits patterns through vibration. The vest, which Eagleman wore during his talk, is outfitted with motors that convert sound, captured by a tablet, into patterns of vibration.
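Eagleman didn't detail the vest's software on stage, but the general sensory-substitution idea can be sketched: split an audio frame's frequency spectrum into one band per motor, then drive each motor at an intensity proportional to that band's energy. Everything below (the function name, the 32-motor count, the band mapping) is an illustrative assumption, not the device's actual algorithm.

```python
import numpy as np

def sound_to_motor_pattern(samples, n_motors=32):
    """Map one audio frame to per-motor vibration intensities in [0, 1].

    Hypothetical sketch: window the frame, take its frequency spectrum,
    split the spectrum into n_motors bands, and let each band's total
    energy set one motor's vibration strength.
    """
    windowed = samples * np.hanning(len(samples))
    spectrum = np.abs(np.fft.rfft(windowed))
    bands = np.array_split(spectrum, n_motors)
    energy = np.array([band.sum() for band in bands])
    peak = energy.max()
    return energy / peak if peak > 0 else energy

# Usage: a pure 440 Hz tone concentrates its energy in the
# lowest-frequency band, so the first motor vibrates hardest.
t = np.arange(2048) / 44100
pattern = sound_to_motor_pattern(np.sin(2 * np.pi * 440 * t))
```

A real device would run this continuously over short overlapping frames, so the wearer feels the sound's spectral shape evolving in time rather than a single static pattern.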

The first application is, unsurprisingly, for the deaf community. On the stage, Eagleman showed a video of a deaf man named Jonathan who trained with the vest for two hours every day. On the fifth day, a man spoke words into the tablet (like “touch” and “where”), and Jonathan was able to write them immediately on a blackboard based on the patterns of vibration alone.

“After three months, it’s the direct perceptual experience of hearing. It’s like a blind person reading Braille,” Eagleman said. The vest is also about 40 times cheaper than a traditional cochlear implant.

This isn’t an entirely new idea. Researchers at Colorado State University have come up with a mouth retainer that turns electrical signals into pulses that can be “read” on the tongue.

But Eagleman believes his system can go beyond applications for the deaf. “What if you could feed real-time data from the Internet directly into someone’s brain?” he asked. He recently ran a test in which a subject felt a stream of stock market data and was asked to press one of two buttons on a tablet. The subject had to figure out which button to press, without knowing that he was making buy and sell decisions based on the stock data.

In the future, he imagines that the vest could be used to extend human perception even further. Pilots could use it to stream airplane data, like pitch, yaw, roll, and heading. Astronauts could “feel” the health of the space station.


It all sounds useful, just as long as it doesn’t become one more source of inane data constantly competing for our attention.

About the author

Ariel Schwartz is a Senior Editor at Co.Exist. She has contributed to SF Weekly, Popular Science, Inhabitat, Greenbiz, NBC Bay Area, GOOD Magazine and more.