If you own an Amazon Echo product, you should be aware that Amazon workers may be listening to recordings of your conversations. As Bloomberg reports, Amazon has a team of thousands of workers, some employees and some contractors, whose job is to listen to voice recordings captured from Amazon Echo owners.
These workers are based around the world in countries such as the United States, Costa Rica, India, and Romania. They listen to, transcribe, and annotate private voice recordings captured from Echo users. The goal is to improve the reliability and usefulness of Alexa, Amazon's voice assistant that runs on Echo devices. While the workers Bloomberg spoke to described most of the recordings as "mundane," some of what gets captured can be upsetting:
Sometimes they hear recordings they find upsetting, or possibly criminal. Two of the workers said they picked up what they believe was a sexual assault. When something like that happens, they may share the experience in the internal chat room as a way of relieving stress. Amazon says it has procedures in place for workers to follow when they hear something distressing, but two Romania-based employees said that, after requesting guidance for such cases, they were told it wasn’t Amazon’s job to interfere.
For what it's worth, Amazon isn't saving recordings of Echo users behind their backs. Users have the option to disable Amazon's use of their voice recordings for improving Alexa, yet many users don't know the opt-out exists, or that Amazon is using recordings of their conversations in this way at all. In a statement to Bloomberg, Amazon said:
We take the security and privacy of our customers’ personal information seriously. We only annotate an extremely small sample of Alexa voice recordings in order [to] improve the customer experience. For example, this information helps us train our speech recognition and natural language understanding systems, so Alexa can better understand your requests, and ensure the service works well for everyone.
We have strict technical and operational safeguards, and have a zero tolerance policy for the abuse of our system. Employees do not have direct access to information that can identify the person or account as part of this workflow. All information is treated with high confidentiality and we use multi-factor authentication to restrict access, service encryption and audits of our control environment to protect it.
Both Apple and Google also use voice recordings to improve their assistant products, and both companies strip identifying information about the individuals in the recordings. Bloomberg's report, however, suggests that Amazon is not anonymizing the information it collects in the same way. Amazon employees also told Bloomberg that they often hear clips that appear to have been recorded without the wake word "Alexa" ever being spoken.
Update: Amazon reached out to us with a statement explaining how Echo devices are designed to know when to listen to a user’s conversations. Per an Amazon spokesperson:
By default, Echo devices are designed to detect only your chosen wake word (Alexa, Amazon, Computer or Echo). The device detects the wake word by identifying acoustic patterns that match the wake word. No audio is stored or sent to the cloud unless the device detects the wake word (or Alexa is activated by pressing a button).
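Amazon's statement describes a simple gating rule: audio stays on the device and is neither stored nor sent to the cloud unless a wake word is detected or the action button is pressed. As a rough illustration only (this is not Amazon's actual implementation, and the function and names below are hypothetical), the decision logic it describes looks something like this:

```python
from typing import Optional

# The four wake words named in Amazon's statement.
WAKE_WORDS = {"alexa", "amazon", "computer", "echo"}

def should_stream_to_cloud(detected_word: Optional[str], button_pressed: bool) -> bool:
    """Hypothetical sketch of the gating rule Amazon describes:
    audio is only sent to the cloud after a wake word is detected
    (via on-device acoustic pattern matching) or a button press."""
    if button_pressed:
        return True
    return detected_word is not None and detected_word.lower() in WAKE_WORDS

# No wake word and no button press: nothing is stored or sent.
assert should_stream_to_cloud(None, False) is False
# Wake word detected: subsequent audio goes to the cloud.
assert should_stream_to_cloud("Alexa", False) is True
```

The key point of the design, as described, is that the acoustic pattern matching happens locally; the cloud only ever sees audio captured after one of these triggers fires. The employee accounts above suggest this gate does not always work as intended.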