Designers: You’re Policy Makers. It’s Time To Act Like It

Technology is shaping our society, and designers are making the decisions of lawmakers.

[Photo: Kelly Sikkema/Unsplash]

The camcorder–the first mass-market device that recorded video and audio simultaneously–did not have a mute button, something that would allow you to record images but not audio. That doesn’t sound like a big deal. But this simple design decision actually encouraged U.S. citizens to violate laws against recording people’s conversations without their permission.

In 2010, a motorcyclist named Anthony Graber was pulled over for reckless driving by a police officer, who drew a gun on him. Graber recorded the entire exchange and later uploaded it to YouTube–and as a result, was charged with violating the state of Maryland’s wiretapping laws. Those laws, originally instituted to protect people’s privacy, make it illegal to record someone’s voice without their consent. But a camcorder designed without a mute button gives users no choice: even if they only want the video, they must also capture their subject’s audio–and violate the law in the process.

Of course, things have changed since then. Many police officers now wear mandatory body cameras. But for computer scientist Latanya Sweeney, a professor of government and technology at Harvard who runs the Data Privacy Lab, the camcorder’s missing mute button is an example of how designers have unwittingly become the new public policymakers. The device’s design doesn’t even give people the option not to record sound, which means that “automatically it was going to put us in the U.S. at odds with our laws,” she says, speaking at the recent Fairness, Accountability, and Transparency (FAT) Conference in New York City.

[Photo: rawpixel/Unsplash]
This means that citizens are at the mercy of their technology–and the people who design that technology have unprecedented power. “We do live in a technocracy. The design of the technology and how it works is the new policy,” Sweeney says. “The thing about designers as policy makers is that we didn’t vote for them, we don’t have any say in the things they believe in, but the decisions they make become the rules we have to live by.”

It’s a powerful idea. The camcorder’s designers probably didn’t deliberately omit a mute button, and likely had no idea its absence would push people to break the law by recording police officers without their consent. But it’s the unintended consequences of technology that can be the most dangerous; just look at how Russian operatives were able to use social media to influence the 2016 election, or how Facebook allows advertisers to discriminate by age and race in job and housing ads.

Once these decisions are designed into products that are already out in the world, large companies have a strong financial incentive to keep them the way they are. Our best bet, Sweeney says, is for a designer to suggest that metaphorical mute button during the development process–not after the product has shipped.

Sweeney acknowledges that anticipating how the technology you’re building might enable people to break the law is incredibly difficult. But fundamentally, it’s the job of designers–the people working one layer above the engineers building the technology itself–to think through its implications. She believes this process doesn’t really happen within tech companies today. “I don’t think those questions get asked. And in companies where it does get asked, it’s asked tongue in cheek,” she says. “They might bring in a user group or an academic or a lawyer, and say this is the new technology that we’re going to produce. But there’s no real depth in exchange.”

The mute button is an example that’s easy to visualize, but Sweeney sees the unintended harm of thoughtless design decisions everywhere. When she first started working at Harvard, Sweeney found that when people Googled her name, they were shown ads suggesting she may have been arrested–which isn’t true at all. That experience led to her groundbreaking work showing that Google’s system served ads insinuating someone had been arrested far more often for searches on black-sounding names than on white-sounding names. In other words, it was engaging in racial profiling. Sweeney says it was the first case that showed technology violating the Civil Rights Act.
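
At its core, an audit like Sweeney’s reduces to a simple statistical comparison: collect the ads served alongside searches for each group of names, then test whether arrest-related ads appear at significantly different rates. Here is a minimal sketch in Python of that final step–the tallies below are invented for illustration, and this is not Sweeney’s actual pipeline.

```python
# Sketch of a name-based ad audit: given tallies of how often
# arrest-related ads appeared for each group of searched names,
# test whether the difference could plausibly be chance.
from scipy.stats import chi2_contingency

# Hypothetical counts (arrest-related ads, other ads) gathered by
# repeatedly searching names associated with each group.
observed = [
    [291, 189],  # searches on black-sounding names
    [308, 452],  # searches on white-sounding names
]

chi2, p_value, dof, _ = chi2_contingency(observed)
print(f"chi-squared = {chi2:.1f}, p = {p_value:.2g}")
if p_value < 0.05:
    print("Ad delivery differs significantly between the name groups.")
```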

At the FAT conference last week, she urged the programmers and academics in the room to keep looking for instances where technology violates the law. She pointed to a group of her students who found instances of discrimination in the online pricing systems of Airbnb and the test prep company Princeton Review over a single one-semester class. “I tell you these stories because it tells us how low-hanging the fruit is,” she added. “It shows us how it is easy to find these unforeseen consequences and have an impact on the business practice or regulation.”

How do designers go about foreseeing how the things they make might discriminate against certain groups of people? Sweeney doesn’t have an answer. But perhaps designers could start by brainstorming potential consequences–not just imagining all the ways to solve a problem, but thinking about the problems their solution might create, too.

About the author

Katharine Schwab is an associate editor based in New York who covers technology, design, and culture. Email her at kschwab@fastcompany.com and sign up for her newsletter here: https://tinyletter.com/schwabability
