During a week in which Facebook’s stock plummeted and regulatory bodies railed against the company’s data protection practices (or lack thereof), many critics are asking whether we could be seeing the beginning of the end for the company. That seems unlikely, but at the very least, the surge of public attention washing over the company is having an effect on users.
Across the web, #deletefacebook is trending–but another form of user dissent is as well. The work of researchers, designers, and artists interrogating the social media giant is reaching a new audience intent on understanding and influencing their own digital footprints. Their work, which includes apps, Chrome extensions, and desktop software, aims to give users some agency over their online lives. These tools are by no means a silver bullet, and it’s worth installing any extension or app with caution. But they show us an emerging genre of research and design that experiments with the massive, infamously opaque platform–nudging, warping, or poking Facebook’s powerful algorithms in independent ways.
Knowing What Facebook Knows
Even the simplest question–what does Facebook know?–can be difficult to answer. To understand the kind of information the platform may have on you, and how it may use it, turn to Data Selfie, a project developed by the artists Hang Do Thi Duc and Regina Flores Mir last year with funding from the New York City Economic Development Corporation, the Mayor’s Office of Media and Entertainment, and the NYC Media Lab.
The Chrome extension generates a “selfie,” or profile, of your Facebook activity and uses machine learning to analyze that behavior in a way similar to Facebook itself. Are your likes more liberal leaning? What does your behavior imply about your psychological profile? Data Selfie–which doesn’t actually record any data from you–offers a glimpse into the kind of behavioral profiling that’s come to light through new revelations about Cambridge Analytica and the harvesting of data from 50 million Facebook users. Check it out here.
Nudging How Facebook Acts
Facebook’s algorithms exist within a black box. The company has never shared how it shapes your individual News Feed to your behavior, and it’s unlikely it ever will. But some researchers are finding a way to do their own “audits” that glean some insight into this algorithmic void. The researcher J. Nathan Matias, who founded the citizen behavioral science platform CivilServant at MIT and is now a postdoc at Princeton University, has blogged about his so-called “audits” over the past year on Medium–for instance, running his own experiments on how Facebook promotes images versus text posts with colored backgrounds, and an earlier experiment on the Pride reaction button. “How much can a single person learn about Facebook with a little patience and a spreadsheet?” he writes. “More than you might expect!”
Matias’s posts include instructions on how to run your own Facebook audit, and he even offers to help you do the statistics or coding if you want to run your own test. “I have often argued that we need independent testing of social tech, especially when a company’s promises are great or the risks are substantial,” he writes. “Sometimes when I suggest this, academics respond that independent evaluations require long, complex work by experts. That’s not always the case.” Learn more here.
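At its core, an audit like the ones Matias describes comes down to collecting engagement counts for two kinds of posts and checking whether the observed difference is bigger than chance would produce. As a rough illustration–this is not Matias’s actual code, and the like counts below are invented–a simple permutation test needs nothing beyond Python’s standard library:

```python
import random

def permutation_test(group_a, group_b, trials=10_000, seed=0):
    """Estimate how often randomly relabeling the posts yields a
    difference in mean engagement at least as large as the observed one."""
    rng = random.Random(seed)
    observed = abs(sum(group_a) / len(group_a) - sum(group_b) / len(group_b))
    pooled = list(group_a) + list(group_b)
    extreme = 0
    for _ in range(trials):
        rng.shuffle(pooled)
        a, b = pooled[:len(group_a)], pooled[len(group_a):]
        if abs(sum(a) / len(a) - sum(b) / len(b)) >= observed:
            extreme += 1
    return extreme / trials  # small values suggest a real difference

# Hypothetical like counts: posts with images vs. plain text posts.
image_posts = [12, 18, 7, 25, 14, 9, 30, 11]
text_posts = [5, 8, 3, 10, 6, 4, 12, 7]

p_value = permutation_test(image_posts, text_posts)
print(f"p ≈ {p_value:.3f}")
```

A spreadsheet can do the same arithmetic; the point, as Matias argues, is that a single patient person can run this kind of independent check without a lab behind them.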
Seeing Less Of What Facebook Serves You
Metrics are a cornerstone of Facebook’s user experience, articulated through the little blue thumbs and smiling and sad faces below each post. These metrics are more influential than their small size suggests. Ben Grosser, an artist and professor at University of Illinois at Urbana-Champaign’s School of Art & Design, has written about how these ubiquitous user interface elements deeply influence user behavior. He has also built several Chrome extensions that throw Facebook’s carefully honed algorithms into chaos–like lobbing a digital smoke bomb into your News Feed.
One, the Demetricator, removes any metric information on your feed, including likes, reactions, and even when a post was made (instead, it says “recently”). Another, Go Rando, chooses a random reaction–like “Sad” or “Angry”–making it more difficult for Facebook to build a reliable psychological portrait of you. Yet another released in 2017, Textbook, removes any imagery from your feed, an experiment meant to underline the role of image-heavy memes and political content on the platform.
Grosser says he’s seen a huge spike in interest about his Facebook-focused work this week, amid new revelations about Cambridge Analytica’s covert scraping of Facebook user data and a Washington Post story about his work. But he notes that he also just launched a version of the Demetricator for Twitter–a reminder that Facebook isn’t the only social network worthy of our critical thought as users. Check it out here.