
The fight to regulate dangerous design is heating up

A new regulatory proposal in the U.K. lays out how online services should be designed to protect kids. Everyone should get the same kinds of protections.

[Source Photo: d3sign/Getty Images]

Yesterday, the U.K.’s Information Commissioner’s Office (ICO), which oversees data and privacy, announced a radical proposal to regulate children’s privacy online. The proposal includes 16 points aimed at safeguarding kids’ data, and it’s designed to limit children’s ability to engage in addictive behavior on social media. That could go so far as to ban “likes” on platforms like Facebook and Instagram and “streaks” on Snapchat for young users.


These interface features are designed both to convince users to keep using the platform and to enable the company to extract data about their activity, providing advertisers a window into who users are. The ICO calls these "nudge techniques" and "reward loops," and decries their ability to psychologically manipulate children, convince them to give up data, and compromise their privacy.

If adopted, the proposal would force apps, connected toys, social media websites, search engines, and other online services that are based in the U.K. or have British users to comply with its rules around kids. But it also outlines what a privacy-first internet might look like: something that would benefit all users, not just kids.

The proposal’s 16 principles, called the Age Appropriate Design Code, would ensure that internet companies turn on the strictest privacy settings by default, provide bite-sized explanations of how a company is using a child’s data, collect as little data on kids as possible, and generally design their services in the best interests of children, according to existing standards set by the UN. Each of these elements would be based on a child’s age, and the code lays out recommendations for how companies can be compliant for each age group: pre-literate and early literacy (ages 0 to 5), core primary school years (ages 6 to 9), transition years (ages 10 to 12), early teens (ages 13 to 15), and approaching adulthood (ages 16 and 17). Many of its stipulations elaborate on elements of the General Data Protection Regulation, the sweeping data governance regulation in the European Union, which is the world’s strictest privacy law but has been criticized for its vague terms.

[Source Photo: Halfpoint/iStock]
Right now, the code is open for comment until the end of May, before the ICO drafts a final version to present to the British Parliament, which would vote on it. In a blog post about the code, the ICO states: “We expect it to come into effect before the end of the year.” If the measures are accepted, they would act like an addendum to the GDPR that’s specific to kids.

Child welfare advocates have been demanding government action as tech companies have started to target children more aggressively. After Facebook announced a Messenger service directed at kids under age 13 in 2018, the advocacy group Campaign for a Commercial-Free Childhood sent a letter to Mark Zuckerberg with more than 100 signatures from childhood development experts and other advocacy groups that urged him to discontinue the app because of research showing that social media can be harmful to kids’ development. (They were not successful.) The CCFC has also submitted complaints to the FTC regarding Facebook, YouTube, and Google’s disregard of the Children’s Online Privacy Protection Act of 1998 (many internet services claim that their users must be over age 13 to avoid complying with COPPA). After Reveal found that Facebook tricked kids into making in-app purchases without their parents’ knowledge or consent, CCFC and other advocacy groups asked the FTC to investigate whether Facebook had violated U.S. law, and lawmakers expressed anger over Facebook’s inability to protect its youngest users. So far, the FTC has yet to investigate Facebook, YouTube, or Google for their violations of kids’ privacy. However, in February, the agency issued a $5.7 million fine against the music-sharing app TikTok for violating COPPA, and there are reports that the FTC is close to fining Facebook for other privacy-related offenses surrounding the Cambridge Analytica scandal.

“The government must step in to protect kids and families from persuasive design and social media,” David Monahan, the campaign manager of CCFC, tells Fast Company via email. “A lot of time and research is put into getting vulnerable kids hooked on these features, so they’ll spend hours on social media, make online purchases, and be rich targets for advertisers. It’s not a fair fight.”


But kids aren't the only ones who suffer from persuasive design elements, or "dark patterns": features of a user interface that aim to convince users to do something that's not in their best interest. Adult users are also vulnerable to many of the nudge tactics that companies use to convince people to give up their privacy. Even if adults are theoretically capable of reading all the words of a lengthy privacy policy that a 5-year-old couldn't, research shows that no one does. Plus, many companies may not realistically differentiate between adults and children who are using their services. As a result, the Age Appropriate Design Code recommends that unless a company has "robust age-verification mechanisms" in place, the privacy guidelines it lays out should be applied to all users.

The U.S. government is also starting to pay attention to the dangers of misleading design for all users. Earlier this month, U.S. senators unveiled a bill aimed at eliminating deceptive design and dark patterns used by companies with more than 100 million users. Similar to the ICO's code, the bill would outlaw certain forms of analysis of children's data (like when Facebook told advertisers it could determine when teens were feeling insecure).

Designers are complicit in building interfaces that take advantage of children. A/B testing, which is prevalent in the design community, tends to help designers optimize for, and manipulate, user behaviors in ways that benefit the company rather than users. Even something as simple as a color can have a psychological impact on users; in fact, the ICO's proposed rules explicitly state that companies should not "exploit unconscious psychological processes (such as associations between certain colors or imagery and positive outcomes, or human affirmation needs)."

Companies' continued quests for engagement and stickiness, on platforms like Messenger Kids and YouTube Kids, show that they can't be trusted to truly put the needs of users first, which has forced regulators to step in. However, the Age Appropriate Design Code could suffer from some of the same criticisms leveled at GDPR, particularly around technical feasibility and vague language. After all, the code allows companies to use less-private default settings as long as they can prove there's a good reason for doing so and that it's in the best interest of the child, but it's not clear how that would be evaluated or enforced. Still, it does offer concrete design solutions for compliance, like ensuring that location tracking or parental monitoring is obvious to the child in the interface, or providing the ability to turn on personalized video recommendations but not personalized ads.

This standard would also only apply to children in the U.K., which limits its impact beyond the country’s borders. But European countries have consistently led the way on privacy, and regulation like the Age Appropriate Design Code provides a model for the rest of the world. GDPR has opened the door to privacy regulation in the U.S., and California’s GDPR-copycat law will go into effect in 2020.

The code is a step forward in protecting kids online, children who are forced to grow up without the privacy that previous generations enjoyed. And it's beneficial for the rest of us, too. Because ultimately, nonaddictive interfaces are just good, human-centered design.


About the author

Katharine Schwab is the deputy editor of Fast Company's technology section. Email her at kschwab@fastcompany.com and follow her on Twitter @kschwabable.
