Stephen Hawking’s Communications Interface Gets Its First Overhaul In 20 Years

If he inputs the word “the,” the system knows the most likely next word will be “universe.” Of course.

For 20 years, Stephen Hawking has used the same system to communicate with the outside world. Now, the 70-year-old physicist, who suffers from an ALS-like motor neuron disease (MND) and is paralyzed, is getting an upgrade from Intel.

In the pre-computer days, Hawking communicated through a painstaking process: a caretaker would point to a board of letters, line by line and letter by letter, until Hawking signaled the one he wanted.

Then, in the 1980s, he was offered a computer interface that had similar functionality. Once he completely lost use of his hands, his caretakers installed a sensor in his cheek that signals an input every time he moves his cheek or blinks. A voice synthesizer takes his selections and translates them into speech.
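The selection scheme described above is a form of single-switch scanning: the system steps through rows, then through letters within a row, and one input (a cheek twitch or blink) picks the highlighted item. A minimal Python sketch of the idea, with an invented letter board and input format (not Hawking's actual layout or software):

```python
# Toy letter board, scanned row by row, then letter by letter.
BOARD = ["ABCDEF", "GHIJKL", "MNOPQR", "STUVWX", "YZ"]

def scan_select(inputs):
    """Simulate single-switch scanning. `inputs` is a sequence of
    (row_steps, col_steps) pairs: how many highlights pass before
    the switch is triggered at each stage."""
    message = []
    for row_steps, col_steps in inputs:
        row = BOARD[row_steps % len(BOARD)]      # first trigger picks a row
        message.append(row[col_steps % len(row)])  # second picks a letter
    return "".join(message)

# Spelling "HI": row 1 ("GHIJKL"), letter 1 -> 'H'; row 1, letter 2 -> 'I'
print(scan_select([(1, 1), (1, 2)]))  # HI
```

Even in this toy form, it is clear why typing speed depends so heavily on how many highlights must pass before the wanted letter comes around, which is what the later prediction work attacks.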

Hawking also used his Windows-based system to perform functions on a regular computer–things like highlighting words, using pulldown menus, and so on. Over the years, though, the system became harder for him to use, both because it grew more difficult for him to control his facial expressions and because the outdated technology couldn’t compensate for his declining typing speed.

Three years ago, Hawking reached out to Intel for help. At that point, his typing speed had dropped to one word per minute, making it more difficult to communicate than ever. “He wanted to be more independent and in control of his system,” explains Dr. Horst Haussecker, a senior principal engineer and director of Intel’s Computational Imaging Lab. “When we came in, we said, ‘We’d like to treat you like a scientific experiment.’ Of course being a scientist, he really liked that idea.”

Today, Hawking and Intel unveiled the new interface, dubbed ACAT (Assistive Context Aware Toolkit).

To figure out what to update in the new system, Intel’s researchers observed Hawking’s day-to-day life. His control software now understands his most likely needs and intentions, exposing the most commonly used features (like a specific pulldown menu on the computer) at the top level of his user interface.
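Surfacing the most commonly used features can be done with simple frequency counting. A minimal Python sketch of that idea; the class and feature names here are illustrative, not taken from ACAT's actual code:

```python
from collections import Counter

class AdaptiveMenu:
    """Tracks how often each feature is used and surfaces the
    most frequent ones at the top level of the interface."""

    def __init__(self, top_n=5):
        self.usage = Counter()
        self.top_n = top_n

    def record_use(self, feature):
        self.usage[feature] += 1

    def top_level_features(self):
        # Most-used features first, so the fewest inputs reach them.
        return [f for f, _ in self.usage.most_common(self.top_n)]

menu = AdaptiveMenu(top_n=2)
for f in ["open_email", "open_email", "mute", "open_email", "zoom"]:
    menu.record_use(f)
print(menu.top_level_features())  # ['open_email', 'mute']
```

The design choice matters for a one-input-per-minute user: every feature promoted to the top level saves a whole chain of menu selections.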

Hawking now types twice as fast. His previous system had a word prediction function, but it was rudimentary at best. “It wasn’t based on sophisticated machine learning. Even your smartphone has a better word predictor,” says Haussecker. Intel and a company called SwiftKey integrated a more modern word prediction system into Hawking’s interface–one that learns from his actions and knows that if he types “the,” his next word is most likely to be “universe.”
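The kind of prediction described here can be approximated with a bigram model that simply counts which word tends to follow which. A toy Python sketch of that idea (not SwiftKey's actual model, which is far more sophisticated):

```python
from collections import Counter, defaultdict

class BigramPredictor:
    """Learns word-pair frequencies from text the user has typed
    and predicts the most likely next word."""

    def __init__(self):
        self.next_words = defaultdict(Counter)

    def learn(self, text):
        words = text.lower().split()
        for prev, nxt in zip(words, words[1:]):
            self.next_words[prev][nxt] += 1

    def predict(self, word, n=3):
        # Rank candidate next words by how often they followed `word`.
        ranked = self.next_words[word.lower()].most_common(n)
        return [w for w, _ in ranked]

p = BigramPredictor()
p.learn("a brief history of the universe and the universe in a nutshell")
p.learn("the universe is expanding")
print(p.predict("the"))  # ['universe']
```

Because the model learns from what the user actually types, its top suggestion after “the” reflects Hawking's own vocabulary rather than a generic corpus.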

But sometimes, Hawking just wants to have a little privacy. Since he has a cheek sensor, his old system would produce a string of random text every time he chewed during a meal. It was embarrassing, and incredibly, the system didn’t have a mute button. The new system does.

Intel also added features that Hawking didn’t even originally request, like the ability to attach email. “We discovered that he was not able to send attachments. When he wanted to, he would call one of his assistants to attach a file to his email,” says Haussecker.

While the system is tailor-made for Hawking’s requirements, Intel will open-source the code base in January so that anyone can use it and tweak it for their needs.

“One of the big learnings for us is that the system has to be very personalized,” says Haussecker.

About the author

Ariel Schwartz is a Senior Editor at Co.Exist. She has contributed to SF Weekly, Popular Science, Inhabitat, Greenbiz, NBC Bay Area, GOOD Magazine, and more.