Sometimes a small tweak can make a big difference. Case in point: the “Active Edge” feature on Google’s new Pixel 2 and Pixel 2 XL phones, which lets you squeeze the sides of the phone to call up Google Assistant for voice commands.
At first I wrote off this feature as a gimmick. There are already plenty of other ways to conjure Google’s voice search, from long-pressing the home button to simply uttering “OK Google” or “Hey Google” within earshot of the phone. How much more useful would a squeeze be?
It turns out that within a couple of weeks of owning the Pixel 2 XL, Active Edge has changed the way I use my phone. I’m much more likely to use Google Assistant now, and all it took was a little less friction.
Barks, Buttons, and Squeezes
Accessing Google’s voice assistant wasn’t always this simple. Previous Google Pixel and Nexus phones didn’t offer a home button or other tactile way to initiate a voice command. Instead, you had to wake the screen, then either swipe on a lock screen icon or hold down the virtual home button. Hands-free voice commands, meanwhile, have gradually spread across Android phones since they first arrived on the Moto X in 2013.
But neither of those approaches feels as effortless as tactile input. Waking the phone and swiping the screen takes extra steps, and wake phrases can be unreliable, especially in a noisy environment. When the whole point of voice commands is immediacy, even a short holdup or occasional frustration can be a killer.
The Pixel 2’s Active Edge feature is more akin to the home button on an iPhone—or the side button on the new iPhone X—but I’d argue that it takes the removal of friction a couple of small steps further.
With the iPhone’s home button, there’s a short delay while iOS distinguishes between a press and a hold. By comparison, the response from squeezing the Pixel 2 is immediate. Active Edge also feels slightly more natural than pressing a distinct button, since you don’t even have to position your finger to use it. To that end, Google nailed the squeeze’s haptic feedback, which really feels like you’re pressing a big pair of buttons on either side of the phone. It also helps that Google Assistant only occupies about half the screen, so you don’t feel as disconnected from whatever else you’re doing if the phone is already unlocked.
These may not seem like major differences, but they’ve added up in a way that makes me feel confident about using voice commands. Now, instead of tapping on the screen to check the weather, make a phone call, or add a to-do list item, I’ll just ask Google Assistant to take care of it.
Room For Improvement?
There are some ways in which Active Edge could reduce friction even further. If the phone is locked, for instance, Google requires a PIN or fingerprint before it answers any query. It’d be nice if this weren’t necessary for non-sensitive responses, such as the local weather forecast or the sort of trivia you’re likely to seek via a Google search.
Some folks have also suggested that Active Edge should be customizable, so that users could open the camera or launch an alternative voice assistant instead. (There does seem to be an elaborate workaround to enable this.) I wouldn’t mind having an option for secondary commands, perhaps mapped to a longer squeeze or a double squeeze.
But then, I’ve used phones that map multiple actions to a single button—Samsung’s Galaxy phones used to assign both S Voice and Google to the home button, for instance—and it ends up being clumsy in practice. Tacking on more or alternative actions would complicate a feature that’s appealing in large part because of its simplicity.
Google needs every advantage it can get in the virtual assistant battle with Amazon’s Alexa and Apple’s Siri, and removing all the hassle of voice controls from its flagship phones is one way to get ahead. It’s already worked for me.