
These are the deceptive design tricks and dark patterns that steer your clicks each day

The terms may be new to you, but you experience them nearly every day.

[Photo: Bia Andrade/Unsplash; Lordalpha1/Wikimedia Commons]

When I visit my aging mother in the Midwest, I watch as she sits at her computer endlessly scrolling on Facebook. Like many of us, her use of social media has gone far past simple utility (“I need it to stay in touch,” she says) and into the realm of compulsive, even addictive, behavior.


Big Tech companies, whether they admit it in public or not, want their sites to be as “sticky” (read: addictive) as possible. That goal has deeply influenced the way they design their sites and services. They use a host of design tricks backed up by behavioral psychology to keep the clicks coming.

Georgetown Law School associate dean Paul Ohm, speaking at a hearing in Washington Tuesday, defined deceptive design this way: “These interfaces are intentionally designed to manipulate users, and here we get into the entire basket of behavioral and cognitive biases that behavioral psychiatry has been studying for many decades,” he said. “What’s critical is the effect of a UX or UI using a dark pattern—it induces a user to take an action that they would not normally take under normal circumstances.”

While the terms “dark patterns” and “deceptive design” may be new to you, you most likely see examples of them on your various screens practically every day. These are the most common ones.

Infinite scroll

The dark pattern we see most often is perhaps the simplest one. When you scroll through the Twitter or Facebook newsfeed, or follow the suggested video path on YouTube, there is literally no end. The tech companies want users to keep scrolling down and down, or clicking the next video, with the idea that the next endorphin hit—a juicier article, a cattier comment, a sexier video, or a new “like” or “share”—might be the biggest one yet. We’re so used to this now that we barely notice it.

This kind of design is meant to create what the psych world calls “variable reinforcement schedules.” It’s the variability of the rewards, the unpredictability of when the next one will come, that keeps us scrolling and scrolling and clicking and clicking. It forms a habit in the brain that can lead to compulsive behavior (hear that, Mom?).
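To make the idea concrete, here is a deliberately toy simulation of a variable reinforcement schedule. Everything in it—the function name, the 30% odds, the labels—is invented for illustration; the point is only that each scroll may or may not deliver a “reward,” and the user can never predict which.

```python
import random

def next_feed_item(juiciness_odds=0.3):
    """Toy model of a variable-ratio reward schedule: each scroll
    sometimes delivers an engaging post and sometimes filler, on
    no predictable pattern. The odds here are made up."""
    return "engaging post" if random.random() < juiciness_odds else "filler"

# Ten scrolls yield an unpredictable mix of hits and misses --
# the same intermittent-reward pattern that slot machines use.
feed = [next_feed_item() for _ in range(10)]
```

It is precisely the unpredictability, not the size, of the reward that psychologists say makes the habit loop so hard to break.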

Natasha Schüll, who authored the book Addiction by Design, believes tech companies have borrowed techniques honed in the gambling industry to keep users engaged. Slot machines used variable reinforcement schedules (those unpredictable rewards) long before social media to keep players sitting there hour after hour.


Share til it hurts

Facebook may be the grand champion of deceptive design: It goads users to “connect more” and “share more” without adding that the personal data it gains from those activities is the very oxygen of its massively profitable business. “[They’re] telling them that they should add more contacts so that they can ‘get more out of friends’ without telling them that that also means that they would know much more about your personal connections, which could include any doctors or lawyers in your life,” said Consumer Reports policy counsel Katie McInnis.

“Misleading prompts to just click the ‘okay’ button can often transfer your contacts, messages, browsing activity, photos, or location information without you even realizing it,” says Senator Deb Fischer (R-NE), who is co-sponsoring a bill with Senator Mark Warner (D-VA) that would restrict deceptive design practices used by tech companies. “Our bipartisan legislation seeks to curb the use of these dishonest interfaces and increase trust online.” Fischer and Warner are members of the Senate Commerce Committee.

Privacy odyssey

Many tech companies design their privacy settings with “agree” as the default option, while users looking for more privacy-friendly choices must often click through a much longer process with multiple screens or confusing options. Facebook’s privacy-choice design may be the masterclass in dark patterns.

This section of a letter Consumers Union wrote to the Federal Trade Commission on dark patterns last summer nicely outlines the variety of techniques Facebook uses to steer users toward sharing more information:

First, Facebook makes it difficult for users to limit data collection by (1) barring a user from making changes to their privacy settings before signing up for an account, (2) directing the user through a confusing dashboard of policies to learn how to change their settings, and (3) requiring the user to click and/or swipe multiple times to alter their advertising preferences. Second, Facebook makes the privacy-protective option more cumbersome by requiring many more clicks and/or swipes for a user to limit the collection of their personal information. Third, Facebook frames various privacy settings to focus only on the benefits—and not the disadvantages—of turning on or allowing settings that collect and disseminate personal information.

The reason Facebook makes it so easy to reduce privacy and so hard to add it is that the company’s whole business is based on using the personal information it harvests from users to target ads. And those ads contribute almost all of the company’s revenue.

Fear of loss

A common tactic used by e-commerce sites—especially travel and hotel booking sites—is a window that pops up saying “45 people are also viewing this product/property/reservation.” Princeton computer scientist Arunesh Mathur, who built a bot to study dark patterns used by internet shopping sites, found that the “45” figure is often the product of a random number generator and has nothing to do with how many people might actually be looking at the product. The idea, of course, is the oldest sales tactic in the world: create a fear in the shopper that one of those 45 other users might buy the product first, leaving the shopper out of luck.
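The mechanism Mathur documented is almost embarrassingly simple. A hypothetical sketch—the function name and the 20–60 range are invented here, not taken from any real site—shows how little it takes to manufacture that urgency banner:

```python
import random

def fake_viewer_count(low=20, high=60):
    """Sketch of the dark pattern Mathur's bot detected: the
    'people viewing' figure comes from a random number generator,
    not from real traffic data. The range is made up."""
    return random.randint(low, high)

banner = f"{fake_viewer_count()} people are also viewing this property"
```

A user has no way to distinguish this output from a genuine live counter, which is exactly what makes the pattern deceptive rather than merely persuasive.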


Other shopping sites display messages saying things like “Bob from Delaware just saved $34 on his order” next to a product the user is viewing. This technique, Mathur says, capitalizes on the shopper’s desire to fit in with the crowd (Bob) and their fear of “missing out” on buying the product. But here too, Bob was generated by a piece of code in the shopping platform.
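The “Bob” notification is just a template filled in by code. In this illustrative sketch, the names, states, and savings range are all invented to show how a platform can fabricate social proof without any real customer behind it:

```python
import random

# Invented sample data for illustration only; a real deployment
# would template these notifications the same way, with no actual
# purchase behind them.
NAMES = ["Bob", "Alice", "Priya"]
STATES = ["Delaware", "Ohio", "Texas"]

def fake_social_proof():
    """Sketch of the templated 'recent buyer' message Mathur
    describes: name, state, and savings are all generated."""
    return (f"{random.choice(NAMES)} from {random.choice(STATES)} "
            f"just saved ${random.randint(10, 50)} on their order")
```

Each refresh produces a fresh, plausible-sounding buyer, reinforcing both the bandwagon effect and the fear of missing out.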

Can’t cancel

Have you ever tried to cancel an Instagram account? First of all, you’ll have trouble even finding the cancel option, which is the front door to a process involving multiple screens and information requests. You may become confused, as I was, about whether the account can be canceled from within the app or whether it must be done on the website (the latter is the only way). In the end, you may be stopped completely when the app gives you no easy option for preserving the photos you’ve already shared with the service. You have to go to another website for that, and it will take Facebook, Instagram’s owner, up to 48 hours to send you your stuff. You may encounter this cancellation odyssey in many different forms across many different apps, but the developer’s goal is always the same—to frustrate you to the point that you give up and keep your account open.

Buzzes and bleeps

Deceptive design isn’t confined to social media and e-commerce platforms; it has also influenced the design of smartphones and other computers. While Apple likes to talk about user privacy and helping users manage the time they spend on their phones, it continues to build haptic feedback into iDevices, which apps can use to buzz our brains back into tech’s dangerous scroll-and-reward cycle. Our phones continue to beep and bloop and alert us back to attention. Our brains are so used to this now that we sometimes feel phantom buzzes in our back pockets or hear phantom alerts in our ears. In this way our devices, and the apps that run on them, hijack our brains and return our attention to the digital realm. It’s all by design.
