
This clever app lets Amazon Alexa read sign language

Because the voice revolution shouldn’t leave anyone behind.

[Source Images: Amazon (photo), Evgenii_Bobrov/iStock]

By Mark Wilson

The voice assistant speaker revolution of Google Home and Amazon Alexa has left the deaf community behind. It’s a twofold problem: these devices were never trained to decipher the speech of many people with profound hearing loss, and anything Home or Alexa says in response can’t be heard by the user. Adding a screen to display information, as on the Echo Show, might help, but it can only get someone so far if they want to have a natural conversation with a machine.

Now, one creative coder has built a solution. Abhishek Singh, whom you may recognize for building Super Mario Bros. in augmented reality, built a web app that reads sign language through a camera, then says those words aloud to an Amazon Echo. When the Echo speaks its response, the app hears it and types it out. The app lets deaf people “talk” to a voice interface using nothing but their hands and their eyes.
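To make that relay concrete, here is a minimal sketch of the round trip in a browser app, assuming the standard Web Speech API for both text-to-speech and speech recognition. The article doesn’t specify which speech APIs Singh used, and none of these function names are his.

    // Hypothetical sketch, not Singh's published code: speak a recognized sign
    // aloud so the Echo hears it, then transcribe the Echo's spoken reply as text.
    function relaySignToEcho(recognizedPhrase: string): void {
      // Browser text-to-speech plays the phrase out loud for a nearby Echo.
      window.speechSynthesis.speak(new SpeechSynthesisUtterance(recognizedPhrase));
    }

    function transcribeEchoReply(outputEl: HTMLElement): void {
      // Listen through the microphone and display whatever the Echo says in response.
      const Recognition =
        (window as any).SpeechRecognition || (window as any).webkitSpeechRecognition;
      const recognizer = new Recognition();
      recognizer.continuous = true;
      recognizer.onresult = (event: any) => {
        const latest = event.results[event.results.length - 1];
        outputEl.textContent = latest[0].transcript; // "type out" the Echo's answer
      };
      recognizer.start();
    }

    // Example usage:
    // relaySignToEcho("Alexa, what's the weather?");
    // transcribeEchoReply(document.getElementById("echo-reply")!);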

“The project was a thought experiment inspired by observing a trend among companies of pushing voice-based assistants as a way to create instant, seamless interactions,” says Singh. “If these devices are to become a central way we interact with our homes or perform tasks, then some thought needs to be given to those who cannot hear or speak. Seamless design needs to be inclusive in nature.”

To build the system, Singh trained a model in TensorFlow, the popular machine learning framework, signing words at his webcam over and over again to teach the system what sign language looks like. Then he plugged in Google’s text-to-speech capabilities to read the recognized words aloud to the Echo.
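As a rough illustration of that recognition step, here is a sketch that assumes the trained model was exported for TensorFlow.js; the input size and label list are made-up placeholders, not Singh’s actual vocabulary. The recognized word would then be handed to a text-to-speech call like the relay sketched above.

    // Illustrative only: classify the current webcam frame with a TensorFlow.js model.
    import * as tf from '@tensorflow/tfjs';

    const LABELS = ['hello', 'weather', 'lights', 'thanks']; // hypothetical sign vocabulary

    async function classifySign(video: HTMLVideoElement, model: tf.LayersModel): Promise<string> {
      // Grab a webcam frame, resize it to the model's expected input, and normalize.
      const scores = tf.tidy(() => {
        const frame = tf.browser.fromPixels(video)
          .resizeBilinear([224, 224])
          .toFloat()
          .div(255)
          .expandDims(0); // shape: [1, 224, 224, 3]
        return model.predict(frame) as tf.Tensor;
      });
      const best = (await scores.argMax(-1).data())[0];
      scores.dispose();
      return LABELS[best]; // e.g. pass this to relaySignToEcho() above
    }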

Singh’s solution certainly feels a bit like a roundabout game of telephone, with one translator talking to another to get a point across. It sheds light on the problem of spoken interfaces for the deaf community as much as it solves it. Soon, Singh will open-source his code and share the full methodology behind it, “so hopefully people can take this and build on it further or just be inspired to explore this problem space,” he says.

Ultimately, the best, most seamless solution would be for something like the Echo Show to cut out Singh’s middleman and recognize ASL all on its own. “That’s where I hope this heads. And if this project leads to a push in that direction in any small way, then mission accomplished,” says Singh. “In an ideal world I would have built this on the Show directly, but the devices aren’t that hackable yet, so [I] wasn’t able to find a way to do it.”

ABOUT THE AUTHOR

Mark Wilson is the Global Design Editor at Fast Company. He has written about design, technology, and culture for almost 15 years.

