Apple Explains How It’s Making Siri Smart Without Endangering User Privacy

Critics say Apple is depriving the AI that powers Siri of the personal information it needs to be an effective assistant. Apple execs say that’s a false narrative.

I wake up and put on my Apple Watch, and while I’m making coffee, I look down and see that traffic could be gnarly on the way to work, so I move a little quicker. I also see a reminder of a late-morning meeting, then a mid-afternoon flight down to L.A. As I leave the house, I glance down and see that the weather will be clear and sunny. At the airport, my boarding pass will show up on my wrist. Over the course of the day, between these little assists, my Watch shows me a few photos of my new niece.

That’s what a day with the new Siri watch face (in the latest watchOS 4) might look like. It’s a more personalized and proactive version of Siri, and one that might show up in other Apple OSs and devices. Siri is slowly using more artificial intelligence to behave more like a human assistant who knows enough about you to give you helpful little nudges and reminders at the right times during the day. In theory, the more the assistant learns about you, your habits, and your habitat, the more insightful and helpful those little assists can become.

Siri on the Apple Watch

Most of the big tech companies are now developing their own personal assistants in one form or another. To feed them, many tech companies tend to vacuum up as much of your data as they can from your various devices and the cloud services you use. They then use powerful cloud computing and machine learning to combine and analyze that data. This allows them to connect a lot of dots about your habits, intents, and preferences. Those learnings can then be used to offer helpful and insightful assists. For instance, knowing your dining history, location, time of day, and other data, a digital assistant might suggest a well-liked pizza place that’s not crammed with customers.

But in Silicon Valley’s growing AI war, one narrative that’s emerged over the past year says that Apple’s AI efforts lag behind the work of other big tech companies in part because of its dedication to protecting user data. Among the big tech companies, Apple has taken a hard line on privacy and has tried to resist any collection of personally identifiable user data on its servers. The company has repeatedly argued–sometimes in court–that your personal data should be kept private, untouched by the police, advertisers, or even, in most cases, by Apple itself.

By not sending users’ personal data to the cloud, it’s been argued, Apple may be hindering Siri’s potential, starving the AI models it depends on of the personal data needed to deliver more personalized and informed assistance to users.

Apple has been relatively silent on that narrative. But several members of Apple’s AI and Siri teams who recently spoke to Fast Company said user privacy and smart AI are not competing principles.

“I think it is a false narrative,” said Greg Joswiak, Apple’s VP of product marketing. “It’s true that we like to keep the data as optimized as possible, that’s certainly something that I think a lot of users have come to expect, and they know that we’re treating their privacy maybe different than some others are.”

Joswiak argues that Siri can be every bit as helpful as other assistants without accumulating a lot of personal user data in the cloud, as companies like Facebook and Google are accustomed to doing. “We’re able to deliver a very personalized experience . . . without treating you as a product that keeps your information and sells it to the highest bidder. That’s just not the way we operate.”

How Siri learns—and how much personal data it needs to be effective—is of utmost importance to Apple: Future updates to Siri will give it an increasingly central role in our interactions with all kinds of Apple products.

Craig Federighi, the company’s senior vice president of software, wrote in an email to Fast Company that “Siri is no longer just a voice assistant . . . Siri on-device intelligence is streamlining everyday interactions with our devices.” Apple teams have “worked to make it a core part of all of our platforms”—iOS, MacOS, tvOS, watchOS, and HomePod.

“With the software update coming this fall, users will experience even more Siri functionality, and in the years to come it will be ever more integral to the core user experience on all of our platforms,” Federighi said.

AI On The Device

Like its rivals, Apple carries out a lot of fancy processing and machine learning tasks on data the user speaks or types. The majority of it–especially tasks that involve very personal information–happens on the device, locked away from the view of Apple or anyone else but the user.

“They are erring on the side of privacy by doing a lot on the device, and not so much in the cloud as other people do,” says Creative Strategies research analyst Ben Bajarin. “It’s just a very different approach than the way Facebook, Google, and, to some extent, Amazon, are going about it.”

When you speak a request to Siri, your iPhone’s software strips the request of any reference to your user ID and assigns it a random request ID. The request is then encrypted and sent up to the cloud, where voice recognition identifies the words and natural language processing works out their meaning.

If the words happened to be, “Hey Siri, make an appointment with Bob next Wednesday at 3:30 p.m.,” Siri running on the server would send down a generic directive to add the appointment to your calendar. But it’s the client device that reassociates the command with the user, adding the appointment to the calendar app running on the device where all the user’s personal calendaring information resides.
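
A rough sketch of that round trip might look like the following, with hypothetical types standing in for Siri internals that Apple hasn’t published:

```swift
import Foundation

// Hypothetical types sketching the de-identification round trip described
// above; these are not Apple's actual Siri internals.
struct SiriRequest {
    let requestID: UUID         // random, per-request; no stable user ID
    let transcript: String      // the words the user spoke
}

struct ServerDirective {
    let action: String          // e.g. "calendar.addEvent"
    let parameters: [String: String]
}

// On the device: strip identity and tag the request with a throwaway ID.
func makeAnonymizedRequest(for utterance: String) -> SiriRequest {
    return SiriRequest(requestID: UUID(), transcript: utterance)
}

// Back on the device: only here is the generic directive re-associated
// with the user, whose calendar data never left the phone.
func apply(_ directive: ServerDirective, forUser user: String) {
    guard directive.action == "calendar.addEvent" else { return }
    let title = directive.parameters["title"] ?? "Appointment"
    let start = directive.parameters["start"] ?? "unknown time"
    print("Adding '\(title)' at \(start) to \(user)'s on-device calendar")
}
```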

Other Siri tasks are performed independently of the cloud. When a user says, “Hey Siri, find all photos of my wife from September,” the iPad or iPhone’s software relies on facial recognition computations done on the device itself in order to find the right images. (Since last year’s release of iOS 10, the Photos app has used machine learning models on the iPhone to recognize people, places, and things in the photo library; in the soon-to-be-released iOS 11 and macOS High Sierra, the Photos app will leverage still more personal data.)
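
The Vision framework arriving in iOS 11 gives a feel for this on-device pattern. Here is a minimal sketch of local face detection; it is illustrative only, since the Photos app’s actual person-recognition pipeline isn’t public API:

```swift
import Vision
import CoreGraphics

// Face detection run entirely on the device with the Vision framework.
// Bounding boxes are computed locally; no pixels leave the device.
func detectFaces(in image: CGImage) {
    let request = VNDetectFaceRectanglesRequest { request, error in
        guard error == nil,
              let faces = request.results as? [VNFaceObservation] else { return }
        for face in faces {
            print("Found a face at \(face.boundingBox)")
        }
    }
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```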

“Your device is incredibly powerful, and it’s even more powerful with each generation,” Joswiak said. “And with our focus on privacy, we’re able to really take advantage of exploiting that power with things like machine learning on your device to create an incredible experience without having to compromise your data.”

The graphics and other specialized chips that carry the computational load in Apple’s devices have been steadily increasing in computing power and speed.

Apple benefits greatly from the fact that it controls both the hardware and software involved in on-device machine learning computations. One Apple executive told me that the company’s hardware and software engineers spend a lot of time optimizing its machine learning software to work better with the client device’s processors and sensors. Such integration helps Apple phones and tablets manage the considerable computing load of functions like natural language analysis and image recognition. It’s also been reported that Apple is working on a dedicated chip to handle all kinds of artificial intelligence computations.

What Goes To The Cloud

Apple trains its AI models on the cloud, but without identifiable user data. For instance, Apple may use a third-party training image set to teach an AI model how to reason out whether an image in the Photos app is a tree or a fishing pole or a woodpecker.

At the heart of Siri are Apple’s natural language and voice recognition models, which allow Siri to recognize the words users say, and their meaning. In some cases, teams use the audio of users’ voice requests as training data—all anonymized, Apple says.

“We leave out identifiers to avoid tying utterances to specific users, so we can do a lot of machine learning and a lot of things in the cloud without having to know that it came from [the user],” Joswiak said. In other words, Siri can learn things about users as a whole without tapping into individuals’ personal data.

Apple holds on to six months’ worth of user voice recordings to teach the voice recognition engine to better understand the user. There are plenty of voice requests to choose from: Siri is used by 375 million people every month and is available in 36 countries and 21 languages. Apple has even built models that specialize in helping Siri understand the utterances of people who speak English as a second language.

After those six months, Apple saves another copy of the recordings, sans user ID, for use in improving Siri; these recordings can be kept for up to two years. The audio of requests about music, sports teams and players, and businesses or points of interest is also kept to train Siri, Apple says.

(Apple has also begun implementing a new anonymization technique called differential privacy, using it in iOS 10 to improve emoji suggestions, for instance, and in iOS 11 to help Safari detect and block sites that auto-play content; but Apple says Siri doesn’t currently rely on that technology.)
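
The idea behind differential privacy is easiest to see in its textbook form, randomized response. The following is a minimal sketch, not Apple’s production mechanism, which is considerably more elaborate; the point is that each individual report is noisy and deniable, yet the aggregate stays accurate:

```swift
import Foundation

// Randomized response: each user flips a coin before answering, so no
// single report reveals a true answer, but the server can still estimate
// how many users truly answered "yes."
func randomizedResponse(trueAnswer: Bool) -> Bool {
    if Bool.random() { return trueAnswer }  // coin 1: answer honestly
    return Bool.random()                    // coin 2: answer at random
}

// Server side: P(reported yes) = 0.5p + 0.25, so p ≈ 2 * observed - 0.5.
func estimatedTrueRate(of reports: [Bool]) -> Double {
    let observed = Double(reports.filter { $0 }.count) / Double(reports.count)
    return 2 * observed - 0.5
}

// 10% of simulated users truly answer "yes"; the estimate recovers ≈ 0.1.
let reports = (0..<100_000).map { i in randomizedResponse(trueAnswer: i % 10 == 0) }
print(estimatedTrueRate(of: reports))
```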

Anonymized user request data helps train the AI models that let Siri suggest apps to you, detect events in messages and add them to your calendar, and present relevant news items based on your interests. The training happens on Apple’s servers, but the models only start practicing what they’ve learned once they’ve been deployed to your device.

Once on the device, the models begin to run computations on things you type or tap into your device, or on things that are seen in the device’s camera, heard through the microphone, or sensed by the device’s sensors. Over time, this creates a massive pile of personal data on the device, as much as 200MB worth. Siri’s job is to use that data to glean insights about you that lead to increasingly helpful assists.
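
Some of that local analysis is visible in public APIs. The sketch below uses NSLinguisticTagger, whose word-level tagging arrives in iOS 11, to pull names and places out of typed text without any network call. Siri’s own pipeline is private, so this only illustrates the flavor of on-device computation:

```swift
import Foundation

// On-device named-entity tagging: people, places, and organizations are
// identified locally, with no round trip to a server.
let text = "Meet Anna at Caffè Macs in Cupertino on Wednesday."
let tagger = NSLinguisticTagger(tagSchemes: [.nameType], options: 0)
let options: NSLinguisticTagger.Options = [.omitWhitespace, .omitPunctuation, .joinNames]
tagger.string = text

let range = NSRange(location: 0, length: text.utf16.count)
tagger.enumerateTags(in: range, unit: .word, scheme: .nameType, options: options) { tag, tokenRange, _ in
    if let tag = tag, [.personalName, .placeName, .organizationName].contains(tag) {
        let token = (text as NSString).substring(with: tokenRange)
        print("\(token): \(tag.rawValue)")  // e.g. "Anna: PersonalName"
    }
}
```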

In iOS 10.3, more of your data feeds Siri and related features thanks to a new addition called iCloud Analytics. It’s currently opt-in, and Apple says it will use “data from your iCloud account in a privacy preserving manner.”

Banking On User Trust

Apple’s competitors like Google and Facebook will likely become more creative and aggressive about leveraging user data and cloud computing to make their assistants increasingly helpful. They are advertising companies that deliver services from the cloud. Naturally they have a much different approach to collecting and leveraging user data. Their businesses depend on it.

Apple is increasingly a cloud services business. Delivering services like Apple Music and iCloud now contributes about a quarter of the company’s revenues, and that share is growing. But Apple is, arguably, still a hardware company: fully two-thirds of its revenues come from selling iPhones.

From a strategic point of view, Apple has the luxury of controlling the hardware platform on which users experience the end result of its AI work. And by keeping much of the basic machine learning models on the device where the users’ most personal data resides, the company can maintain its position as a dogged protector of user data privacy. That’s not just good PR, but smart business: Apple wants to sell you on devices that are personal, go-to tools for organizing your life, and that means being a trusted, secure lock box for your most personal and private data.

Actually, Apple’s device-centric approach could ultimately prove far better at learning about and analyzing your preferences and behavior than that of the other companies. “Because Apple is keeping the personal information within the protection of its own device,” Bajarin said, “you could make the argument that they are in the best position to learn intimate details about you in a way that others are not.”

In any case, the debate over Siri’s functionality—and Apple’s AI abilities—with respect to privacy may be mostly academic. At the moment, at least, it’s not easy to point to one crucial function that Siri is prevented from delivering as a result of Apple’s insistence on privacy, or from a dearth of user data.

Siri’s problems are more basic. The assistant doesn’t always provide answers that are as correct and robust as those from Google’s Assistant when the question involves internet search or mapping. (Siri relies on Microsoft’s Bing search engine, and its own Maps app.) Other issues are caused by garden walls. Like many people, I use Gmail for my personal email, but Siri is able to learn about the travel reservations I just booked only if the confirmation email shows up in Apple’s Mail or Calendar apps. Similarly, Siri can only answer music requests with songs from Apple Music, not from YouTube or other third-party services like Spotify.

On the other hand, Siri often shines when it comes to providing help and answers that are personalized to the user. It might suggest a new News topic after noting that you read a lot of blogs on a specific topic. It might figure out that you normally work out on Tuesday and Thursday mornings at the same time, and start proactively suggesting a workout at those times.

A New Way Of Thinking

Siri has been around since 2011, but Apple gave the assistant a significant brain transplant only two years ago.

The transformation came in two parts. First, after relying for years on the Nuance voice recognition engine to understand what users were saying to Siri, Apple decided to begin developing its own voice recognition and natural language engines.

This move dovetailed with a larger shift in the way Siri’s brain works, toward cutting-edge neural networks that very roughly mimic the brain’s neuronal structures. Before the change, Siri–including its voice recognition engine–used a rules-based approach in which new words or images captured by Siri were checked against a large knowledge graph for their identification or meaning. The assistant could understand things it had been explicitly trained to understand, but could not learn to understand new things.


With the introduction of artificial intelligence approaches to enhancing Siri’s cognitive skills, Apple began training its AI models to learn by themselves.

The goal of the training, as one Apple engineer told me, isn’t so much to get the model to successfully recognize the training data—it could be a set of images, or words or phrases—but rather to teach the model to comprehend a wide range of images or terms like those it might encounter in real-world usage. The training, then, is more about teaching the model to continually test itself for correctness, and to become more accurate and adept at comprehending new things.
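
In code terms, that discipline amounts to scoring the model on examples held out of training. Here is a generic sketch, with a stand-in protocol rather than any real Apple API:

```swift
// A stand-in protocol, not an Apple API: the point is the evaluation
// discipline, not any particular model.
protocol Classifier {
    mutating func train(on examples: [(input: String, label: String)])
    func predict(_ input: String) -> String
}

// Success is measured on examples the model never saw during training.
// A model that aces the training set but fails here has memorized,
// not learned to generalize.
func heldOutAccuracy(of model: Classifier,
                     on testSet: [(input: String, label: String)]) -> Double {
    let correct = testSet.filter { model.predict($0.input) == $0.label }.count
    return Double(correct) / Double(testSet.count)
}
```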

Apple’s new approach first paid dividends in Siri’s ability to understand users. Last year, many began noticing that Siri was stumbling over their utterances far less, and comprehending the meaning behind the utterances far more. As Apple collects more and more words from the mouths of more and different kinds of users, Siri will get even better at comprehension.

Just two years with a new way of thinking isn’t a very long time, especially compared with the amount of time some of Apple’s competitors have been developing and deploying AI. But even if Apple’s AI R&D isn’t quite as advanced as Google’s or Facebook’s, as some argue, the company may make up for it by being smarter about the way it applies the technology. If Apple can arm Siri with new AI-powered tricks people will really use—and earn users’ trust by protecting their data—the number of research papers its scientists publish about pure AI might not matter so much.

That kind of thinking is in line with the company’s user-centric approach, which Brian Croll, Apple’s VP of product marketing for software, described to me when I visited campus earlier this summer.

“Companies often invent technologies and it’s like they have a hammer and then try to find a nail,” Croll said. “We start with a consumer problem and then go to our toolbox of technologies and figure out which ones will help us build what we want to build.

“I think that’s unusual in Silicon Valley and why Apple has been so successful with consumers,” he said. “Because consumers don’t really care about technology for the sake of technology—they care about what it does.”