I think I broke my Facebook.
That might sound like something your Luddite aunt would say, but I’m being serious. It started about two years ago, when, in a fit of annoyance at all the baby pictures flooding my news feed, I systematically unfollowed every single person and organization in my network except the actual news outlets. That promptly turned my sprawling social network of friends, frenemies, and strangers into a mere news reader plugged into just a half-dozen publications. Problem solved! No more updates about people’s lives.
Two years later, this seems like a grave mistake. I find myself curious about what people are doing. I’m falling behind in real-life conversations about what’s happening with friends. Put another way, it’s literally impossible for me to use Facebook for its original purpose. There’s also a follow-on effect that I hadn’t anticipated: If you unfollow people on Facebook, you drop out of their Facebook feed as well. So now, whenever I have something I really want to share–a new job, or the final draft of the book I’ve been writing for years–I’m met with crickets. I’m stranded on the digital equivalent of a deserted island.
There’s no obvious way to get off this island. I could manually re-follow everyone I unfollowed. But even if I did, I have no idea whether Facebook would put me back into their feeds. For all intents and purposes, my Facebook is ruined. And I suspect that over time, you’re ruining yours without even realizing it.
This is a problem that Facebook hasn’t acknowledged directly, and it’s even worse on Twitter, where following people eventually makes your feed into an unruly mess. This dilemma will only grow as other services increasingly lean on passive choices to shape the user experience. I call it dead-end UX. It’s what happens when user preferences build up over time, painting you into a corner you cannot get out of. In a world where personalization algorithms rule everything we experience, from the shopping recommendations we get to the people we hear from, dead-end UX may be one of the deepest problems with digital experience.
We’re Caught In The Dreary Valley
The media has been panicking for a couple of years now about the idea that Facebook creates information bubbles: cozy places where we hear from only the people who think just like us, giving us little to no exposure to opposing viewpoints. But lost in that conversation has been a look at how confirmation bias plays out from Facebook’s point of view. Let’s roll the clock back to the very first “Like” you record on the platform. Using that little bit of data, along with whatever else is available, Facebook tries to extrapolate what else you might like. Its entire view of you is filtered through that single data point. But if you go on to Like dozens and then hundreds of other things that it throws in front of you, it starts generating a stronger and stronger hypothesis of what you’ll react to.
Of course, the problem here is that the process feeds on itself, creating a feedback loop in which it is impossible for Facebook to recognize you as anything other than the stream of Likes you’ve created. Data scientists have a couple of concepts to describe these effects. One is the “local maximum,” which you can think of as picking out your favorite dish from a local restaurant. Sure, that dish might be the tastiest thing in that restaurant, but who’s to say that there aren’t other restaurants that you don’t even know about, with even better dishes on offer? With Facebook, there is no obvious way to tell if we’re in the actual best restaurant–or if we happened to end up in that restaurant just because we made a couple of arbitrary choices so long ago that we don’t remember. One problem with that is that we change over time. It’s completely unclear if Facebook’s view of us changes along with us, though I’m sure Facebook’s data scientists would slap their heads in frustration and say that of course it does. Yet even if it does, you and I cannot tell. Call it the Dreary Valley, where everything started off lovely enough, and then everything started to look the same.
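To make the local-maximum idea concrete, here’s a toy sketch in Python–invented items, topics, and quality scores, and nothing like Facebook’s actual system–of a greedy recommender that always serves the unseen item whose topic best matches your past Likes. One early Like locks the feed into a single cluster, even when better content sits elsewhere:

```python
# A toy, invented example -- nothing like Facebook's real ranking system.
# A greedy recommender always serves the unseen item whose topic best
# matches your past Likes, so one early Like traps the feed in a single
# cluster (a local maximum), even when better content exists elsewhere.

ITEMS = {
    # item: (topic, quality you'd actually experience, 0-10)
    "baby_pic_1":  ("babies", 4),
    "baby_pic_2":  ("babies", 5),
    "baby_pic_3":  ("babies", 4),
    "hiking_post": ("outdoors", 9),
    "trail_map":   ("outdoors", 8),
    "news_story":  ("news", 7),
}

def similarity(item, liked_topics):
    """Score an item purely by how often its topic appears in past Likes."""
    topic, _quality = ITEMS[item]
    return liked_topics.count(topic)

def greedy_feed(first_like, steps=2):
    """Repeatedly recommend the unseen item most similar to the Like history."""
    seen = [first_like]
    liked_topics = [ITEMS[first_like][0]]
    for _ in range(steps):
        candidates = [i for i in ITEMS if i not in seen]
        best = max(candidates, key=lambda i: similarity(i, liked_topics))
        seen.append(best)
        liked_topics.append(ITEMS[best][0])
    return seen

# One early Like of a baby picture fills the feed with more of the same,
# while the higher-quality outdoors and news posts never surface.
print(greedy_feed("baby_pic_1"))
# -> ['baby_pic_1', 'baby_pic_2', 'baby_pic_3']
```

The feedback loop is in `liked_topics`: each recommendation the user accepts makes the dominant topic score even higher on the next pass, so the algorithm never has a reason to explore.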
Two Ways to Solve Dead-End UX, For Facebook And Beyond
Local maxima are the inevitable result when algorithms optimize without a bigger view of what should be happening, or what you want to be happening overall. It’s what happens when you look just a couple of steps forward and back–not 10 steps ahead into the future. While it’s true that confirmation bias is a profound part of our psychology, it’s also true that human beings have higher-order values that can overrule our lizard brains. We can believe in things such as freedom of speech even if we’d like our neighbor to shut the hell up about #MAGA.
One problem with Facebook is that it keeps evolving little features–essentially settings–that will never be very successful because they are, by nature, small. To solve the problem of discovering stories you’d otherwise miss, Facebook unveiled an “Explore” button buried midway down a giant menu. To solve the fake news problem, it created a tiny little “i” button offering context about a source. But you can’t solve foundational issues with tiny features. To solve a problem like dead-end UX, you have to rethink the service in a bigger way.
I have two modest proposals for how Facebook might do this. The first is easy. The second is hard.
The first would be to create a reset button for your Facebook experience. The idea of being able to undo an action is probably the oldest concept in human-machine interaction–it is one of the first principles codified in the early days of the discipline. For every action a user takes, they should be able to undo that action. Facebook of course lets you undo tiny actions that you take–you can undo a Like, and you can undo a friend request. The designers knew better than to omit such a basic feature. But there is no such thing as a macro-level undo button.
You can think of Facebook as a bunch of tiny buttons that you’re asked to push all the time. But you can also think of Facebook itself as one giant button marked “This Is Who I Am!” Just as all those tiny buttons can be undone, that giant button should have an undo as well. You should be able to restart your Facebook experience. You should be able to hit a button that takes the algorithms determining what you see back to zero, so that you can try to retrain Facebook anew while leaving your friend networks intact.
The second is something that doesn’t yet exist but should: the ability to shape your Facebook experience not just through the tiny things you Like, but through the broad experience you’d like to have. Imagine if Facebook were like a concierge, who asked you what kind of information diet you’d like to have. You might tell that concierge, “Tell me everything that I need to know about the world, and tell me lots about the few friends I care about, and only major life events from everyone else.” Or you might say, “Keep away from politics and give me a ton of stuff that’s inspiring or adorable.”
Facebook’s algorithms supposedly work like this. The problem is that Facebook is the only entity capable of deciding how well it’s doing that job. In the future, machine learning should allow what we want at a high level to be translated into qualities that ranking algorithms can parse. The coming wave of AI-driven user experience needs to let us, the users, set our higher-order preferences. It needs to let us express not just preferences at the level of who to follow and who to mute–but rather experience preferences that get at what we really want.
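As a rough sketch of what that translation could look like mechanically–the preset names, topics, and weights here are all invented, not any real Facebook API–a user-stated information diet could become per-topic weights that scale an ordinary engagement-based ranking:

```python
# Hypothetical sketch: a user-stated "information diet" is translated into
# per-topic weights, which then scale a conventional engagement prediction.
# All names and numbers are invented for illustration.

PRESETS = {
    "keep me informed": {"news": 3.0, "life_events": 2.0, "politics": 1.0, "cute": 0.5},
    "no politics, lots of joy": {"news": 1.0, "life_events": 1.5, "politics": 0.0, "cute": 3.0},
}

def rank_feed(posts, diet):
    """Order posts by predicted engagement scaled by the user's stated diet."""
    weights = PRESETS[diet]
    return sorted(
        posts,
        key=lambda p: p["predicted_engagement"] * weights.get(p["topic"], 1.0),
        reverse=True,
    )

posts = [
    {"id": 1, "topic": "politics", "predicted_engagement": 0.9},
    {"id": 2, "topic": "cute", "predicted_engagement": 0.4},
    {"id": 3, "topic": "life_events", "predicted_engagement": 0.5},
]

# The same posts, reordered by the user's chosen diet rather than raw engagement.
print([p["id"] for p in rank_feed(posts, "no politics, lots of joy")])
# -> [2, 3, 1]
```

The point of the sketch is who holds the knob: the engagement model stays the same, but the weights come from an explicit, user-visible choice rather than from inferences the user can neither see nor reset.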
To be clear, I have no idea if Facebook is thinking along these lines or not. As it has grown, it’s shown less and less willingness to talk about what it thinks. But the problems I’m outlining here reach beyond Facebook. So do the solutions. Dead-end UX is what happens when any service purports to know who you are and what you want based on a scant few signals. In an era in which all digital experiences will be customized to us to some extent, we need to recognize dead-end UX as a foundational problem worthy of solving. We will need ways to hit reset and express the values that we want a digital experience to uphold.