Ahead of this year’s CES conference, Apple made a billboard-sized privacy statement: “What happens on your iPhone stays on your iPhone,” it declared. It was a thumb to the nose of tech companies (*cough* Facebook) that peddle users’ personal data to the highest bidder.
But a new report from the Guardian may have Apple rethinking its advertising campaign. According to the report, Apple’s digital personal assistant, Siri, occasionally and quietly sends audio it has collected to human contractors for quality control and to improve its listening skills. That audio can include recordings of confidential medical information, drug deals, and even couples having sex, recordings that most people would assume do stay on their iPhones.
Apparently, a small proportion of Siri recordings are passed on to contractors to grade Siri’s responses: whether the voice assistant was helpful, whether its response was appropriate, and whether it was activated accidentally or deliberately. According to the whistleblower who spoke with the Guardian, “The sound of a zip, Siri often hears as a trigger,” which could lead to some awkward recordings being sent for analysis. The contractor said those accidental activations were reported, “but only as a technical problem,” and that there was no specific protocol for handling sensitive recordings, whether of doctors’ visits or of anything involving a “zip” sound.
While Apple says the recordings taken by Siri and sent for grading are “pseudonymized,” “not associated with the user’s Apple ID,” and “analyzed in secure facilities” by reviewers working under “strict confidentiality requirements,” it’s a long way from what happens on your iPhone staying on your iPhone. And it’s probably not what consumers expect from a company that considers user privacy a core tenet.
We reached out to Apple for additional comment and will update if we hear back.
The news comes in the wake of the departure of Siri’s longtime leader, Bill Stasior, who was replaced by senior VP of machine learning and AI strategy John Giannandrea. Siri, of course, is not the only voice assistant whose recordings are reviewed by humans. In April, it was revealed that thousands of Amazon staff listened to some Alexa interactions, and earlier this month, Google workers were found to be doing the same with Google Assistant.