The Pact. You’re probably part of at least one. Here’s how it works: You use a digital product. The way you use that app or gadget gives the company that built it data about your behavior. Maybe you see a more specific ad in your feed. Maybe your keyboard gets better at predicting that weird word you always use. Maybe your data is used, along with millions of other data points, to train a neural network. But whatever the case, we tend not to fully understand how our data is being used. And most of us ultimately don’t really care. We count it as part of The Pact. Besides, it’s not always easy to find out how our data is being used.
Yesterday, Amazon debuted a new $199 Echo device, called Echo Look, which functions like a standard Echo but includes a camera, light, and depth sensor. It can photograph your outfit, but given its depth-sensing technology, it can also likely glean detailed information about your body and your space.
Echo Look has an app that uses machine learning and trained human stylists to provide feedback on how you look–a feature Amazon calls Style Check–in the form of a percentage rating between two outfits. It’s basically a new way to take selfies, and judging by the launch video, it’s marketed mostly toward women. (Isn’t technology amazing . . . at perpetuating society’s endless demands of women?)
It’s unclear whether there’s a Pact at work here, but as many privacy advocates pointed out, users should generally be hesitant to bring a connected, AI-powered camera into their bedrooms. Zeynep Tufekci, the New York Times contributor and assistant professor at the University of North Carolina, Chapel Hill, pointed out that the data collected by Echo Look could be analyzed to glean much more about you than your outfit’s cuteness. “With this data, Amazon won’t be able to just sell you clothes or judge you. It could analyze if you’re depressed or pregnant and much else,” she tweeted. As Co.Design’s Mark Wilson has written, machine learning is capable of detecting illness with startling accuracy, ushering in a new era of privacy and ethics concerns.
All this to sell you more clothes. We are selling out to surveillance capitalism that can quickly evolve into authoritarianism for so cheap.
— Zeynep Tufekci (@zeynep) April 26, 2017
So how does Amazon plan to use all those selfies and outfit pics? Will it use images as training data to improve other products like Amazon Rekognition, the deep learning-powered image recognition service that Amazon Web Services debuted last year? Over email, the company noted “rigorous controls” on the data, and told Co.Design that specific personnel will have access to it to “improve our services,” but didn’t say what services that included beyond Look’s Style Check feature.
So, unsurprisingly, it’s unclear if or exactly how Amazon plans to use all that data. Amazon said that it will give users clothing recommendations based on their “looks.” But will it use information from your images to sell you other products, like cat litter if it sees your cat, or new T-shirts if it notices an errant rip or pit stain? (“We’re supposed to recommend clothing, but nothing can fix this,” my coworker joked. “We chipped in and bought you a gym membership.”) The spokesperson said that the app “may display interest-based advertising using information customers make available to us when they interact with our sites, content, or services,” noting that “we do not provide any personal information to advertisers or to third-party sites that display our interest-based ads.” The company didn’t say whether there would be an opt-out option for sharing your images or videos, or how long it would store your data, but noted that you can delete your data whenever you want.
It’s easy to feel ambivalent about digital privacy; after all, it’s part of The Pact. But Echo Look, however it really uses data, is part of a much larger phenomenon being ushered in by the age of big data and machine learning. We might not care how our data is used to sell us things today–but what about when it’s used to determine whether you get a loan, or are fit to adopt a child, or any number of other life-changing decisions? We need to start demanding that companies disclose how they’ll use, or sell, our information. It shouldn’t be that difficult to understand where your photos and videos end up. Designers have a role to play here as well: as advocates for users, they can create clear, concise disclosures about the way a product collects and uses information, and tools that help us opt out.
But the movement toward more transparency–which Co.Design staffers have taken to calling “lucid design”–is still in its infancy. And unfortunately, as long as we keep buying the next selfie machine or clicking “accept” on that Terms of Service prompt, we don’t have much leverage.