Arnav Kapur wants to be very clear about something: what you’re about to see is not mind reading, even though it certainly looks like it.
AlterEgo is an AI-enabled headset that Kapur, an MIT Media Lab researcher, has been developing for the past several years. He demonstrated it on stage for the first time at TED 2019 in Vancouver last month; you can now watch the video of the device in action below.
Using powerful sensors, the small headset detects the signals the brain sends to internal speech mechanisms, such as the tongue and larynx, when you speak to yourself. Imagine asking yourself a question without actually saying the words aloud. Even if you don’t move your lips or your face, your internal speech system is still doing the work of forming that sentence: muscles like the tongue vibrate in accordance with the words you’re thinking, in ways that are subtle and nearly undetectable. “It’s one of the most complex motor tasks we do as humans,” Kapur says. AlterEgo picks up on these internal vibrations and transmits them to an AI embedded in the device, which translates them into language.
Kapur just won the $15,000 Lemelson-MIT Student Prize for the device, which is not yet commercially available. But the use cases are endless: you can ask the device a question, and, through a separate part of the headset that uses bone-conduction audio, the AI can transmit an answer that you hear in your head but that is inaudible to anyone nearby. While Kapur was giving his TED talk, a man named Eric was on stage with him demoing AlterEgo. Silently, Eric asked what the weather was like in Vancouver at that moment. AlterEgo picked up the question, then delivered the answer back to Eric via bone-conduction audio: it was 50 degrees and raining.
You could also, Kapur says, connect the device via WiFi to a smartphone, and silently dictate text messages or commands to smart-home assistants. Or you could ask AlterEgo to solve math problems for you, or act as an in-head translator for foreign languages. Eventually, he says, AlterEgo could potentially store things you need to remember, like grocery or to-do lists, that you could later access through the AI.
Even though all this happens in a way that’s all but undetectable, save for the small headset, it’s not just reading your thoughts; you have to consciously decide to use it. Kapur likens it to typing on a keyboard. “When you type on a keyboard, that action is a type of mind reading: the keyboard is reading an electrical signal your brain sends out through your fingers,” Kapur says. And the ready access to information is, in essence, no different from how we interact with smartphones to check the weather, communicate, or solve problems. Kapur thinks of AlterEgo as bringing AI into closer communication with humans, augmenting our capabilities in an unobtrusive way.
As with any technology, there are potentially significant security concerns here: AlterEgo can store information you speak to it, so if you’re someone who readily forgets your Social Security number, for instance, you may not want to rely on the device to store and retrieve it.
But the implications are enormous, Kapur says, for the millions of people who struggle with speech, such as those with ALS, oral cancer, or the aftereffects of a stroke. Kapur met with Doug, a man diagnosed with ALS who could not speak verbally: he communicated by using head movements to select letters, a process that takes several minutes to form a single sentence. Wearing AlterEgo, though, he could relay what he wanted to say through a synthesizer that read out the words as he thought them, in real time. One of Kapur’s motivations in developing AlterEgo was to return control and ease of verbal communication to people who struggle with it.