After reporting a couple of weeks ago on how Amazon lets some workers listen to Alexa voice recordings, Bloomberg is back with a follow-up that says those workers can access users’ locations as well. In some cases, the Alexa audit team can see precise geographic coordinates, which they could easily look up in an app like Google Maps to determine someone’s whereabouts. Two sources apparently blew the whistle after concluding that Amazon was giving its workers overly broad access to personal data.
In response, Amazon told Bloomberg that “access to internal tools is highly controlled, and is only granted to a limited number of employees who require these tools to train and improve the service by processing an extremely small sample of interactions.” The company added that it regularly audits this access and has a “zero-tolerance” abuse policy. (Bloomberg didn’t find any evidence that workers had abused their access to customers’ data.)
For Amazon, listening to recordings from an Echo speaker or other Alexa device does serve a legitimate purpose, as it allows the company to improve speech recognition when something goes wrong. Location data serves a similar role: the Alexa team uses it to improve local search results. (Also worth reiterating: Amazon only collects audio samples after someone says “Alexa,” rather than constantly sending audio to its servers.) Still, Amazon doesn’t fully spell out the implications of this data collection, and the page where users can opt out says nothing about location data.
To prevent Alexa workers from listening to your recordings, head to this page while signed into your Amazon account, select “Manage How Your Data Improves Alexa,” and disable “Help Develop New Features.”