Facebook and Google aren’t that interested in putting social content in front of you that will broaden your horizons or introduce you to a diversity of viewpoints. They’re interested in finding out what you like and then putting as much of that stuff in front of you as possible. In recent years these companies have put some guardrails around the kinds of content they’ll use for that, but the main idea is still the same.
Furthermore, the algorithms tech companies use to curate content live in a black box. They’re a secret, even though their work can have serious social and political consequences. The Citizen Browser Project, a new effort by the nonprofit tech watchdog news site The Markup, hopes to demystify those algorithms—and maybe explain why I see a completely different Facebook than my neighbor does.
“Social media platforms are the broadcasting networks of the 21st century,” says The Markup’s editor-in-chief, Julia Angwin. “They dictate what news the public consumes with black-box algorithms designed to maximize profits at the expense of truth and transparency.”
The Markup has empaneled a group of 1,200 people who will use a special browser when they browse Facebook and YouTube. The browser will send back information in real time about the content being served up to the participants—just the content served, not the way users interact with it. In time the data will create a statistically valid sample that delivers insights about how Facebook’s and YouTube’s algorithms work.
Angwin says the view from the browser will enable researchers at The Markup to make connections between the personal characteristics or demographics of a given user and the content they are shown. It will reveal differences in the way the platforms serve content to Black users versus white users, for example, or conservative users versus liberal ones, she says. She believes the browsers will also show whether some users are served more harmful content, such as COVID-19 misinformation, than others.
The truth of an algorithm isn’t in the way it is written but rather in the results it produces.
The Markup’s research subjects will get started using their browsers within the next few weeks, “hopefully before the election,” says Angwin. They’ll continue using them at least until the presidential inauguration in January.
Angwin is a Pulitzer Prize-winning ex-Wall Street Journal and ProPublica journalist who has written extensively about the algorithms used by tech companies. I asked her if she thinks tech companies should be required to make their algorithms publicly available, as some, including right-wing politicians, have called for. “Based on my experience over the years, I don’t think that’s necessary, because even the simplest algorithms are hard to understand,” she says.
Angwin points out that the truth of an algorithm isn’t in the way it is written but rather in the results it produces. She adds that Big Tech companies increasingly use machine learning to suggest content, in which case there’s no real algorithm to inspect—just a complex system of probability measurements constantly reacting to new data.
The Markup plans to publish the findings of its browser study in a series of articles in the New York Times next year.