
Dark Interactions Are Invading Our Lives. Where’s The Off Button?

As things like AI, facial recognition, and motion-tracking become more sophisticated, we are taking part in an exploding number of human-computer interactions–and we don’t even know it.


Legendary computer scientist Mark Weiser famously said, “the purpose of a computer is to help you do something else.” It’s a beautiful sentiment, but it hasn’t come true. We live in constant contact with machines. And that contact is growing. As Ian Bogost points out in The Atlantic, we can’t get enough computers in our lives. Everything from our cars and homes to our garden hoses and bike locks is now connected. This creates hundreds, if not thousands, of little interactions to manage.


But there’s a catch. At this point we can only manage the interactions we actually know about. As things like AI, facial recognition, and body movement tracking become more sophisticated, we are taking part in an exploding number of human-computer interactions–and we don’t even know it.

SAP chief designer Martin Wezowski called these unknown human-computer exchanges “dark interactions,” a twist on the concept of dark data. We’re used to the idea of bright interactions–those everyday swipes, taps, and verbal exchanges we have with our phones and laptops. Dark interactions are the opposite–all the interactions between people and machines that aren’t visible and that we don’t consciously make.

Dark interactions aren’t inherently bad. They’re a crucial part of the interaction-design ecosystem, hiding the nuts and bolts of our digital products to ensure a smooth user experience. And as artificially intelligent systems become more sophisticated, these unconscious “conversations” will be the primary way we interact with machines. Smart systems will track our movements and preferences to provide personal services without the need for command and control interactions. Self-driving cars will show up at our front doors when we need them. Our refrigerators will stock foods we like and send back food we don’t. Dark interactions can help designers create a world in which computers serve up well-timed experiences.

But they’re also easy to abuse.

We got a first look at that in 2012, when Target began tracking the shopping habits of a young woman buying baby gear and prenatal vitamins. The woman didn’t know she was feeding data to a computer system, but the information she was giving away was enough for Target to build an accurate profile of her shopping habits. The store then started mailing baby-related coupons to her address. It turned out she was a teenager living with her father, and Target exposed her pregnancy before she had a chance to tell her dad.

Five years later, this kind of behind-the-scenes tracking is more pervasive–and invasive–than you think. One of the most frightening examples is in Xinjiang, the region of China where most of the country’s Muslim Uighurs live. There, the national government has installed sophisticated surveillance technology: apps that track internet usage, plus cameras paired with artificial intelligence that collect facial recognition data and track body movements such as “gait signatures”–the unique way each of us walks. The Chinese government is creating a detailed digital library of every person’s online activity and real-world physical movements. As BuzzFeed writes, this is “what a 21st century police state really looks like.”


One of the most alarming aspects of this is how the Chinese government is tracking people’s movements in the public sphere. Traditionally, the public sphere promises anonymity, and in the West we value it for exactly that reason. We can walk down the street without anyone recognizing us, browse a store without being pressured to buy, or join a protest without being identified.

But that assurance is increasingly elusive, and we are being tracked in vastly more ways than we realize. Location sensors in our phones and cars log our driving and walking patterns. CCTV cameras measure our body movements and record our gait signatures. Retail stores map our shopping patterns. Even things as mundane as children’s toys and vacuum cleaners may be breaching our privacy. We have an inkling that we’re being measured online when an Amazon ad pops up or Facebook suggests people we should connect with. But we have no idea how it’s done, and the Amazons and Facebooks of the world aren’t telling us their secrets. With dark interactions, most of us don’t even know when we’re being tracked or measured.

In the United States, we have laws that prevent certain kinds of tracking, but they are woefully out of date. They deal mainly with how data is collected on the internet, not with how data from dark interactions is collected in the real world–through cameras, sensors, and the digital appliances that surround us. That has to change. There’s a short technical distance between a business tracking our movements to deliver timely promotions and the scary monitoring going on in China. What is happening there will arrive here soon enough if we don’t take action. We need to make sure privacy laws are amended to address the impact of dark interactions.

But we can’t simply wait for the law to catch up. Individuals should have access to and control over the data captured about them. My modest proposal: As the world around us fills with sensors and systems that track our behavior, the next generation of connected devices needs a switch–a mute button for personal privacy as readily accessible as the mute switch on the side of my iPhone. It would not only shut off a device’s ability to record or report my activity; it would also signal to any nearby sensors that I’m demanding the right to be ignored. It’s not enough, and it may be ridiculously optimistic given companies’ current penchant for collecting information without our permission, but it’s a start. We deserve a choice in how we navigate the public sphere–and we deserve to retain control over who we are.


About the author

Mark Rolston is cofounder and chief creative officer of argodesign. Previously, he led frog's global creative vision.
