Apple has announced that it will suspend its Siri grading system worldwide following a report last week that the company was using human contractors to listen to snippets of Siri recordings in order to grade the accuracy of the digital assistant.
Grading is a common technique among makers of voice assistants: it compares what users actually said to what the assistant thought they said. Companies like Amazon, Google, and Apple use human contractors to review, or “grade,” these interactions by listening to snippets of recordings of users interacting with the assistant.
All three companies have come under fire recently as reporting has shone a spotlight on their human grading programs. While the companies’ terms and conditions state that small samples of assistant queries may be recorded and reviewed to improve the service, many users never read the T&Cs and are alarmed when news of such grading systems spreads.
The news of Siri’s grading system was perhaps more alarming than the revelations about Amazon and Google because of Apple’s public stance on privacy. But in a statement to TechCrunch, Apple said it is suspending Siri grading globally, effective immediately. Better yet, a future software update will let users opt out of having their Siri queries included in the grading process. Apple’s statement in full:
We are committed to delivering a great Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.
There’s no word yet on when Apple’s Siri grading system will resume, nor when users will be given the ability to opt out, but it’s likely the two will go hand in hand.