The internet is a morass of things you don’t want to see, but can’t stop looking at. The artist Ben Grosser has a solution: Safebook.
The Chrome and Firefox extension, which launched over the weekend, removes every bit of content from the platform, leaving behind only the user interface, which remains fully functional. “Concerned about the impacts of Facebook on your mental health, but don’t want your friends to miss out on those ‘likes’?” Grosser writes. “Try Safebook, Facebook without the content.”
In other words, Safebook is Facebook without the faces. It is part elaborate send-up, part earnest investigation of the social network’s inner mechanisms. Grosser, who is based at the University of Illinois at Urbana-Champaign, has launched a handful of other extensions for Facebook and Twitter that tinker with the way each platform works.
On his own Facebook wall, Grosser can often be seen running experiments and trying out coding tricks, gauging the reactions of his friends and asking for feedback. One extension I’ve had installed on my computer for a few months, Demetricator, scrubs the user experience of any metrics, like the number of likes a post has, or the time it was published. Another, called GoRando, will respond to any post with a random reaction. These extensions turn social media platforms, where the user so rarely has any agency, into “spaces of experimentation,” as Grosser told me earlier this year.
Safebook goes much further, removing any kind of text or imagery and turning the News Feed into a benign cruise through muted shades of blue and pinstripe wireframes. It should make Facebook completely unusable. But Grosser, whom I emailed after I tried the new extension, points out that it doesn’t.
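The article doesn’t detail how Safebook is built, but the general technique it describes — hiding what a page says while keeping its interface clickable — can be sketched as a browser-extension content script that injects blanket CSS rules. The selectors and the `buildBlankingCSS` helper below are illustrative assumptions, not Safebook’s actual code:

```javascript
// Illustrative sketch only — not Safebook's implementation.
// Builds CSS that blanks media and text while leaving the page's
// layout and click targets intact.
function buildBlankingCSS() {
  return [
    // Hide images and video but keep the space they occupy,
    // so the page's wireframe geometry is preserved.
    "img, video { visibility: hidden !important; }",
    // Make text invisible without removing the elements, so buttons
    // and links stay exactly where they were and remain clickable.
    "body * { color: transparent !important; }",
  ].join("\n");
}

// In a content script, the rules would be injected roughly like this:
// const style = document.createElement("style");
// style.textContent = buildBlankingCSS();
// document.head.appendChild(style);
```

The choice of `visibility: hidden` and `color: transparent` over `display: none` matters: removing elements outright would collapse the layout, while blanking them preserves the “pinstripe wireframes” the article describes, and the site stays navigable by muscle memory.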
“I find it uncanny how well I can still navigate the site with Safebook installed,” he responds. “Sure, I don’t know what someone posted—or who posted it—but I can still perform the daily labor of liking my friend’s posts.” Which is exactly what makes the extension so unsettling, and more than the one-liner its description suggests. “I would argue, as you also suggest, that this shows how the design of the Facebook interface is driving much of our daily interaction with it, that it has taught us what to do,” he says.
Is the extension a critique of the social network’s issues with moderation? To my eyes, it implies that the only version of Facebook that’s “safe” is one where no content exists at all. But Grosser says the point is broader: Facebook’s design itself is the driving force behind many of its problems. “From Facebook’s dependence on quantification (to encourage posting, to evaluate popularity, to inform News Feed curation, etc.) to its need for constant growth (continued profit depends on more users, more likes, etc.), the entire system is constructed in a way that will always make possible (even, possibly, encourage?) threats to privacy, health, and democracy,” he explains. “Facebook can try and patch it all they want (with content moderation, News Feed algorithm tweaks, reputation metrics, etc.), but I would argue that any alternative requires a radical transformation, one that isn’t dependent on quantification or endless growth.”
Try it for yourself, if you dare, right here.