A string of presenters at Google’s press event in San Francisco on Tuesday took veiled shots at Apple and Apple products, eliciting oooohs and laughter from the audience, which was full of Google employees and partners as well as journalists. We don’t get a ton of drama in the tech world, but we’ll take it where we can get it.
Google announced a bunch of new products, and many of them–and the specific features they offer–seemed to be direct responses to Apple products and features. Google’s presenters, however, didn’t miss too many chances to argue that Google is doing the technology better, often by leveraging various forms of artificial intelligence.
Google VP and Pixel phone product manager Mario Queiroz, while talking about the new Pixel 2 phone’s camera system, said “We don’t set aside better features for the larger device.” This was a clear shot at the way Apple reserves certain features for more expensive iPhones. This year, of course, Apple saved the newest, coolest features for its $1,000 iPhone X. And only the iPhone X and iPhone 8 Plus, not the lower-priced iPhone 8, got dual cameras and the fancy new Portrait mode and lighting effects.
Google says the advanced software and machine learning in the Pixel phone obviate the need for two cameras anyway. A video about the new Pixel phones presented the slogan “So you need a second camera, right? No need.” Another clear dig at Apple.
One slide during the presentation showed the Pixel 2 connected to another phone with a white cable. Queiroz explained that “your photos, apps, and even your iMessages” could be moved from your old phone to a new one in just 10 minutes. That old phone, of course, was an iPhone. More laughter from the audience.
Queiroz announced that Google will continue to offer Pixel owners free unlimited photo and video storage, saying that Pixel users shoot twice as many photos as typical iPhone owners–23 gigabytes of imagery on average per year. His punchline: “If you had to use iCloud, you’d reach your free storage limit in about three months.”
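That punchline is easy to sanity-check. The calculation below is a sketch assuming iCloud’s free tier is 5 GB (a figure not stated in the presentation) and using Google’s own 23 GB-per-year average:

```python
# Rough check of Queiroz's "about three months" claim.
# Assumptions: iCloud's free tier is 5 GB (not stated above);
# 23 GB/year is Google's figure for average Pixel users.
AVERAGE_GB_PER_YEAR = 23
ICLOUD_FREE_GB = 5

months_to_fill = ICLOUD_FREE_GB / (AVERAGE_GB_PER_YEAR / 12)
print(f"Free iCloud storage fills in about {months_to_fill:.1f} months")
# → about 2.6 months, close to the quoted "about three months"
```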
When they weren’t verbally zinging Apple, the Google people were busy announcing new products that seemed designed to one-up recently announced Apple products.
For instance, Google answered Apple’s music-focused smart speaker with its new Google Home Max smart speaker. The Max, which has two 4.5-inch speakers and two tweeters, is larger and probably louder than the HomePod speaker, which has a single 4-inch driver and one tweeter. Both companies made a big deal out of the fact that they use special software to tune the output of their respective devices to perfectly fit the room. Both the Max ($399) and the HomePod ($349) will go on sale this holiday season.
Google also introduced its answer to Apple’s AirPods with the new Pixel Buds. Well, sort of. As with the AirPods, you can tap one earpiece to play music, get directions, and otherwise utilize the two companies’ respective voice assistants. The Pixel Buds differ from AirPods in that they are connected to each other with a cable. (They do connect wirelessly with your phone.) But they do support a wider set of gestures for controlling phone functions than the AirPods do.
Where the Pixel Buds really one-up the AirPods is in translation. A wearer can, for example, speak English into the earphones and have the translation spoken in another language from the phone’s speaker. The feature works the other way around, too, across 40 languages, which works out to 1,600 language combinations. For now, Apple’s Siri can only turn English phrases into Mandarin, Spanish, French, German, or Italian.
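The 1,600 figure comes from counting ordered (source, target) pairs of the 40 supported languages. A quick check — note that the count only lands on 1,600 if same-language pairs are included, which the presentation didn’t spell out:

```python
# How "40 languages" yields "1,600 combinations": ordered
# source -> target pairs, apparently counting same-language pairs
# (excluding them would give 40 * 39 = 1,560, not 1,600).
languages = 40
combinations = languages * languages
print(combinations)  # 1600
```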
The Me-Too Syndrome
Google announced a new Pixelbook, which in some ways seems to target the same market as Apple’s iPad Pro (and Microsoft’s Surface, for that matter). Not only does it have a 12.3-inch touchscreen, but it also supports a stylus like the Apple Pencil. Google’s pen is based on Wacom technology and has 2,000 levels of pressure sensitivity. So far, Google is emphasizing using it for note-taking and annotation rather than drawing.
The new Pixel 2 and Pixel 2 XL phones, as well as the new Clips camera, can shoot something called “motion photos”–where the camera captures a few seconds of video on either side of a still shot to catch both the moment and the context around it. This is very much like Apple’s Live Photos, a feature introduced with iOS 9 and the iPhone 6s and 6s Plus. Google pushes it a step further by using AI to pick out which frame in the motion photo makes the best still image.
Apple introduced Portrait mode with the iPhone 7 Plus, and now Google has added a portrait mode to the Pixel cameras. In both the Pixel and the iPhone, the camera detects several layers of objects at different distances from the camera. The background can then be blurred to set off the photo’s subject in the foreground.
At other times, Google seemed to sing from the same hymnal used by Apple. For one feature, in which the Pixel phone can listen to the music playing in the room and display the song’s name on the phone’s always-on screen, the presenter stressed that the song recognition happens on the phone itself and that no information is shared with Google. Apple often talks about how its AI runs only on the device, and how it avoids intermingling personally identifiable information in the cloud.
At another point in the presentation, Google’s hardware chief Rick Osterloh pointed out that Google isn’t always going to be “first to market” with certain technologies, preferring to take a little longer and create something great. Anybody who has watched Apple for any length of time knows that it’s been playing the “not first, but best” game for many years.
So Apple and Google are either sharing marketing plans (probably not!) or the two companies have very similar views of the kinds of products and features tech consumers are looking for today. Based on what I saw today, it looks like Google’s plan is to offer consumers everything Apple does–and then use its deep AI knowledge to make it all work better.