Last week, Strava released 1 billion user activities on a global heatmap, inadvertently revealing secret humanitarian and military base locations by showing exercise activity in countries like Afghanistan, Syria, and Niger.
It’s a security crisis and a PR nightmare, but Strava isn’t apologizing and the company appears to be keeping the data public (though it has reportedly disabled some features). Its attitude seems to be that people have used its platform to upload workouts publicly for nine years: Why would people be furious now?
Perhaps because, although Strava bills itself as the “Social Network for Athletes,” its sign-on experience gives users little indication they will be sharing their location publicly. The sign-up flow is so seamless that some users have no idea they’re sharing their location at all. People who would have opted out of sharing never did, because they never knew to check their settings. It’s a classic case of user-friendly design being too user-friendly.
Companies often make sharing the default, because they know users have limited attention and want a good user experience. The hope is that users who want privacy will figure it out before they share too much. (Of course, if that revelation never comes, users feel betrayed.) Therein lies the tension in any application that collects data. How can we design social products that are simple to use but still explain complex privacy rules? Design should surface critical information about how much of a user’s data is visible to the public in a timely way.
Terms of Service pages are the wrong way to convey privacy expectations
It’s not a secret—most users don’t read them. Pinterest has a beautiful Terms of Service page, with plain language summarizing each section. But plain language is not why people know that Pinterest is a social site. The first page a user sees on Pinterest is a smattering of images and the people who shared them. Pinterest doesn’t tell you how your data will be used, it shows you.
Compare that to Pocket, a plug-in app that allows you to save interesting articles you want to view later. Your saved list is private, but there’s nothing on the front page that talks about privacy. Users assume it’s private because that is a perfectly normal thing to expect from an app, or any product. Privacy should be the expectation, not the exception.
Strava users can create an account and record their exercise without ever encountering visual or text indicators that users’ data will be shared with the public. Strava doesn’t set privacy expectations, so it’s no surprise that some users would behave as if their content is private.
How can designers set expectations about privacy but not complicate the process?
There are two main design factors to consider. First, does your design show novices how to use a feature automatically or does it have to tell them how with words and walkthroughs? Second, is information about a feature shown as part of its use in a timely way, or tucked away in a help page?
Not every feature can be part of an app’s core real estate. You wouldn’t feature a chat interface on the front page of an app for logging your exercise. Some content must get pushed into side pages or the settings menu. Let’s look at the applications that do it well.
Google Arts & Culture lets you upload a selfie and match it with famous art. Uploading your selfie to a faceless app is terrifying. So when a user first accesses “Search with your selfie,” the app tackles that fear up front with a clear promise not to store or misuse your picture.
Almost everything on Twitter is public by default, and Twitter makes it clear. Twitter doesn’t dedicate a lot of real estate to privacy hints. The home page’s pitch is “Follow . . .”, “Hear . . .”, and “Join . . .”—all of which convey a sense of social sharing. Feeds are filled with people’s words and faces. Everything on Twitter is designed to remind you this is all public (though users sometimes forget just how public it can be).
Facebook is an interesting case because of its complexity. Different things get shared at different levels: only me, friends, friends of friends, or the public. Every post you see has an icon that indicates how widely it was shared. There are periodic privacy checkups to update your settings. The company is constantly showing you reminders about your privacy settings as you post.
Of course, if companies are evil and only want users to overshare data to make more money, they will go right on hiding information about sharing in the darkest depths of their settings menus. Intentional bad behavior will only stop when companies lose enough money because of bad press, activism, or effective regulation.
If default sharing is a side feature, people will miss it. Don’t assume users’ expectations about privacy match yours. If you set users’ expectations with a clear design that surfaces what’s important, you can avoid betraying the trust of a user—or a military base full of users.
Stephanie Nguyen is a user experience designer and researcher and was most recently a Digital Service Expert at U.S. Digital Service at the White House. She’s currently a Masters in Public Policy candidate at Harvard Kennedy School and a Gleitsman Fellow for social change at the Center for Public Leadership. @nguyenist.
Brian Lefler was a software engineer at Google Maps, Google Ads, and Amazon before becoming a founding member of the U.S. Digital Service. Right now, he’s at Harvard Kennedy School as a Robert C. Seamans Jr. Fellow in science, technology, and public policy and is a Masters in Public Administration candidate. @bclefler.