What is Google? It’s an impossibly large, sprawling infrastructure company. It’s a search bar, sure, but it’s also Docs, Hangouts, Doodles, and the Play Store. And it’s countless products and services that we never see unless Google sends us to them.
But now, Google is changing the way that we discover exactly what Google can do. How? It’s basically deconstructing Google Search itself into a bunch of mini apps.
Just look below the Google.com search bar in a mobile browser—or inside the iOS and Android dedicated Google apps—and you’ll notice the update.
Now, the page includes what Google is calling “shortcuts,” which are tappable icons that can take you right to the weather, sports scores, or restaurants around you, all without making you type.
“Getting up-to-the-minute info is as easy as a single tap,” promises Google in the feature’s announcement. And sure, at first glance, it’s a pure time-saving play. Say 20% of people come to Google for just a few common topics like movie times and sports scores. Why not link those topics as tappable queries, rather than forcing us to type them out again and again?
However, Google’s Search page is the crown jewel of the company. It’s optimized for maximum speed and efficiency (consider how Google autocompletes your question before you finish asking it), and changes to it aren’t made casually. That makes these four app-style icons a very big deal, and they likely signal an important design shift within Google, especially since they are essentially the opposite of the strategy it has followed for years: predictive AI.
Take Google Now. For those who haven’t used it in the Google app or integrated into Android, Google Now is essentially a list of index cards, continually updated by AI, that tries to be proactive in offering the information it thinks you need to know (rather than the reactive results you get from Search). Google Now can be downright psychic in what it suggests. It will pull the scores of your favorite teams without you specifying them, and scan your morning commute for traffic down to the time you usually leave. Learning from your patterns and preferences, Now offers the information you’ll probably want without you even asking for it.
Google’s new shortcuts are a different approach to the exact same thing. Whereas Now would surface traffic, weather, and sports scores through predictive AI magic, shortcuts break those topics out into buttons. And while the information looks pretty much the same, Google is now having users ask for it rather than supplying it before they even make a query.
But why? Why hide the answer to a question Google already knows we want to ask behind an icon? It seems that Google is capitalizing on habits we users have been learning since Apple launched the App Store. Specifically, rather than using AI to give answers before we ask the question, Google is offering the “app” to find the answer we want, before we ask it.
Perhaps consumers prefer the agency involved in tapping or searching to get information, rather than Google just serving it up algorithmically. We’ve already been trained by iOS and Android to click an app when we have a question: “What are my friends doing?” is Instagram or Facebook, while “what is the news?” could be CNN or Twitter. “What should I eat?” is Yelp or Grubhub, and “what should I do?” becomes Netflix or Candy Crush.
In the mobile world, the app is the answer to your question. And as a result, it makes sense for Google to translate its deep list of products into the language of apps across its mobile search platforms.
However, there’s another benefit to Google’s new approach, and you can see it more clearly in the more robust Android version of shortcuts. There, Google lists not just your go-to weather and dining options from Google Now but several other categories of information, including “Tools” like Google Translate, a rollable set of dice, a coin flipper, and a virtual level.
Sure, there’s a novelty to some of this, but Google has essentially broken out many of its esoteric services into an entire page of Google “apps,” full of functionality you might never otherwise know existed. This is the problem the app industry calls “app discovery,” and for Google, whose web apps aren’t even listed in any formal app store, it’s an even greater challenge.
It’s very easy to imagine how Google could combine the two approaches to leverage its AI prediction models with these shortcut apps, making them even more integral and adaptive to your life. Shortcuts could (theoretically) be displayed at just the right time to solve more specific, complex problems in Search. For instance, without context, that Google level is kind of cheesy! But imagine if you were hanging pictures, Google recognized this, and as you went to search for some sort of solution, the level was right there, just waiting for a tap. (It’s not total science fiction. iOS already guesses the apps you’ll want at certain times of the day, and offers them under Siri App Suggestions. And Google Now already knows when you’ll be taking the train home each day and might want Google Maps handy.)
Google will always be a company with too many tentacles to count, meaning consumers will never know what its services can really do. And the world already has too many apps to count, spread across countless icons on our screens. But maybe, when you smash these two problems together, it’s a workable solution—at least until our brains are simply hardwired into the Googleplex.