Inclusive design has a way of trickling down to benefit all users, not just the ones for whom it's originally intended. Voice dictation, for example, was pioneered in the 1980s as an accessibility feature; today, millions of non-disabled people use it every day through voice assistants like Siri and Google Assistant. The same goes for word prediction, a technology developed for people who have trouble typing on traditional keyboards, and one that millions of people now use in the form of smartphone autocomplete.
It normally takes years, or even decades, for this trickle-down effect to become evident. But it’s easy to imagine that Google’s new Voice Access feature won’t take nearly as long to have an impact outside of its intended audience. Announced this week at I/O 2016 as something that will ship with Android N, Voice Access is a way for people with severe motor impairment to control every aspect of their phones using their voices. But once you see it in action, the broader impact of Voice Access is immediately obvious.
Here's how it works. Once Voice Access is installed, you can enable it with Android's "Okay Google" hotword by saying: "Okay Google, turn on Voice Access." From then on, it's always listening, and you no longer need the "Okay Google" command. With Voice Access active, every UI element that is normally a tap target gets overlaid with a number, and you can tell Voice Access to "tap" a target by saying its number aloud.
But these numbers are really just a backup method of control: you can also simply tell Voice Access what you want to do. For example, you could ask it to "open camera," and then tell it to "tap shutter." Best of all, any app should work with Voice Access, as long as it already follows Google's accessibility guidelines.
Technically, Voice Access builds on two things Google has been laying the groundwork for over several years. The first is natural language processing, which lets Google understand your spoken commands. Just as important is the accessibility framework Google has built into Android. After years of preaching accessibility best practices to its developers, Google expects every Android app to label its buttons and tap targets in plain English describing what they do, which allows Voice Access to automatically translate those labels into voice commands. And if some developers don't follow Google's accessibility best practices? Well, that's why Voice Access has the numbered labels to fall back on.
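Those plain-English labels are the same ones Android apps have long supplied for screen readers. As an illustrative sketch (the button and string values here are hypothetical, not from any real camera app), a developer might label a shutter button in a layout file like this:

```xml
<!-- Hypothetical layout snippet: android:contentDescription gives
     accessibility services a plain-English label for a control that
     otherwise shows only an icon. -->
<ImageButton
    android:id="@+id/shutter_button"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:src="@drawable/ic_shutter"
    android:contentDescription="Shutter" />
```

A service like Voice Access can read that label through Android's accessibility APIs, so saying "tap shutter" can resolve to this button; an unlabeled control would fall back to the numbered overlay.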
During a demonstration at I/O, Voice Access seemed absurdly powerful. For example, you could tell Voice Access to open settings, scroll to the bottom of the screen, change an option, and then go back to the home screen, all with your voice. There's also a host of sophisticated voice dictation commands, so you can tell Voice Access to text Mary that "dinner is at 8," then edit the time before sending the message by simply saying "replace 8 with 7."
For the roughly 20% of Americans who have moderate to severe motor impairments, including those caused by Parkinson's disease, essential tremor, and arthritis, the benefits of Voice Access are obvious. These users can finally have full control of their Android devices without ever needing to use their hands.
But according to Google’s accessibility UX researcher Astrid Weber, there are two kinds of disability. There are the people who are permanently disabled, and then there are those who are “situationally disabled.” Situational disability can be as serious as a broken arm or as temporary as having your hands full with shopping bags. The point is that all users are situationally disabled on a regular basis, which means that accessibility features like Voice Access are for everyone–or they will be, eventually.
Voice Access's usefulness to pretty much everyone is so evident that, even driving home from I/O, I found myself wishing I had it on my smartphone. With Android Auto and CarPlay, Google and Apple are making a big show of creating car-friendly interfaces for their smartphones. But the truth is, both platforms are extremely limited, both require special in-car hardware, and it still feels dangerous to look at a screen and tap on it while driving. Being able to control your regular smartphone with your voice, on the other hand, feels natural.
As I was barreling down I-280, I pined for the ability to just tell my smartphone to put on the next episode of my favorite podcast, or open Slack and tell my editor I’m on the road but I’ll have edits in for her in 30 minutes.
Voice Access might fall under the umbrella of accessibility, but to me, it feels just as much a part of Google’s big evolution into an ambient conversational interface as Google Assistant or the recently announced Google Home. Both of those products play into the idea that you want to be able to do all sorts of generalized, low-level tasks through Google without looking at a screen: play some music, turn off the lights, or get information on the weather, for example.
Voice Access is the other half of that. It's a product based on the idea that you'll want to be able to use your voice to do highly specific things when you can't touch a screen. Voice Access is the accessibility feature that fulfills Google's conversational interface promise. Soon, there will be practically nothing Google does that you can't control with your voice.
Voice Access will be available as part of Android N when it is released to the general public later this year.