
Many user interfaces manipulate users into actions that benefit a company’s bottom line. The FTC is concerned—but it’s not clear what actions it can take.

Can the FTC stop the tech industry’s use of ‘dark patterns’?

[Source photo: Nenad Stojkovic/Flickr]

By Rob Pegoraro

You may not recognize the term dark patterns, but you’ve probably seen enough manipulative interfaces to get the idea. A user experience in a site, app, or gadget is constructed to herd customers into following a company’s dictates, even when doing so costs people money or data. Now one of Washington’s consumer regulators is asking how the public sector could address this private-sector plague.

At the Federal Trade Commission’s “Bringing Dark Patterns to Light” online workshop April 29, speakers uniformly denounced these deceptive interfaces in apps, services, and sites. “We increasingly see companies using dark patterns to manipulate consumers into giving up their data,” acting FTC Chair Rebecca Kelly Slaughter said as she opened the online event.

But it was far less clear what the feds should do next, or whether any steps would require new legislation to strengthen the FTC.

Dark patterns can take many forms—a tiny “cancel” or “decline” button, a subscription that’s far harder to stop than to start, legalese shown too quickly or in type too small for anybody to scan—but they all serve to condense a customer decision point into a dialog with one button: “submit.”

The curse of growth hacking

As Harry Brignull, a design researcher in the U.K. who coined the phrase “dark patterns” in 2010 and maintains a site to explain the problem, said at the event: “People can end up purchasing things, sharing things, or agreeing to legal terms without intending to.”

Researcher Johanna Gunawan noted that mobile devices have become fruitful ground for dark patterns, citing services that omit account deletion functions in mobile apps and sites. “There’s no excuse for not allowing users to leave a service in the same place they signed up for it,” she said.

Spotify product designer Kat Zhou, a speaker at the event, blamed the tech industry’s growth-hacking habits. “Companies must ruthlessly prioritize for growth,” she said. “Promotions and raises are often linked directly to these conversions.”

Zhou added that the decentralized structure of large tech firms can impede well-meaning employees: “You find out about something that was shipped to the public that was made by a team five time zones over.”

(Spotify’s site could itself represent a dark pattern: It lacks its app’s private-session option to override the service’s nonobvious default of making your listening habits public. But that did not come up during the FTC event.)

Meanwhile, companies can make real money with dark patterns, as Brignull noted by citing a study that found people spent 21% more at a ticketing site if they weren’t shown service fees up front. “Imagine if you ran a business and you could press a button to get your customers to spend 21% more,” he said. “Of course you’d press it.”

Companies that use dark patterns carefully may get away with manipulating consumers’ behavior without pushing the envelope far enough to anger them. Lior Strahilevitz, a professor at the University of Chicago Law School, shared research into how subjects responded to moderate and aggressive dark patterns in dialogs urging them to sign up for identity-theft protection.

The moderate dark patterns got more than twice as many people to sign up—26% versus 11% in the control group—without leaving subjects angry. “There’s no backlash for companies that employ these techniques, if our results are externally valid,” Strahilevitz said. “They can employ just a couple of dark patterns and get away with it.”

He and subsequent speakers noted that dark patterns work more effectively against users without a college education. But they can also prey on older users who, as University of Illinois at Chicago communication professor Kelly Quinn noted gently, “have not always developed the abilities to understand how technologies work.”


The most rage-inducing part of the FTC event involved how dark patterns affect children—a problem that’s grown worse as the pandemic has left stressed parents resorting to YouTube and other apps to keep cooped-up kids entertained. “Kids have immature executive functions,” said Dr. Jenny Radesky, a professor of pediatrics at the University of Michigan Medical School. “They will follow lures, they will follow rewards.”

Ads shown to kids may not ask for money outright, but by persuading children to install a new game, they can still enable valuable data collection, Radesky added.

Fighting back

What should the government do about these interface insults? Speakers noted that the DETOUR (Deceptive Experiences to Online Users Reduction) Act, legislation first proposed in 2019 by Democratic Senator Mark Warner of Virginia, would ban “obscuring, subverting, or impairing user autonomy, decision-making, or choice” via interface manipulation.

Democratic Representative Lisa Blunt Rochester of Delaware, who sponsored a version of the DETOUR Act in Congress last year, urged action in a talk at the start of the event: “In the absence of responsible action from these companies, it is my firm belief that Congress should act.”

But in his own comments, Warner suggested that the commission already had enough authority under the section of its founding legislation empowering it to police “unfair or deceptive acts or practices.”

“I believe under Section 5 authority, that you can put in place a regulatory structure to try to protect consumers and prohibit this kind of deceptive practice,” he said.

Indeed, in 2017 the FTC settled a case against consumer-electronics maker Vizio for a privacy-eroding interface on connected TVs—although that $2.2 million fine may not have left much of a bruise.

“I think there’s a lot the FTC can do with current authority,” says Lauren Willis, an associate dean and law professor at LMU Loyola Law School. Her suggestion: Have the FTC make more of an example of high-profile offenders.

Inflicting public pain on the most egregious boldface-name dark-pattern practitioners could also sidestep debates about what does and does not rate as a dark pattern. As Strahilevitz asked: “Is nagging protected by the First Amendment as a sales strategy?”



ABOUT THE AUTHOR

Rob Pegoraro writes about computers, gadgets, telecom, social media, apps, and other things that beep or blink. He has met most of the founders of the Internet and once received a single-word e-mail reply from Steve Jobs.

