“User-friendly” was coined in the late 1970s, when software developers were first designing interfaces that amateurs could use. In those early days, a friendly machine might mean one you could use without having to code.
Forty years later, technology is hyper-optimized to increase the amount of time you spend with it, to collect data about how you use it, and to adapt itself to engage you even more. Meanwhile, other aspects of our tech have remained difficult to use, from long, confusing privacy policies to the lack of any explanation of why and when your data is being collected, much less how it’s being protected. While some aspects of apps and platforms have become almost too easy to use–consider how often Facebook invites you to check out a friend’s latest update or new comment–others remain frustratingly opaque, like, say, the way Facebook tailors advertising to your behavior.
So what does it mean to be friendly to users–er, people–today? Do we need a new way to talk about design that isn’t necessarily friendly, but respectful? I talked to a range of designers about how we got here, and what comes next.
A few years ago, one Google exec claimed that after it tested 41 shades of blue for its Gmail ads in 2009, the company saw a $200 million windfall. It’s a well-worn anecdote now, but it underscores the fact that data analytics are still a new tool in the design world. “Working with data is still relatively new to designers,” says Vinh. “Analytics have been around since the beginning, but they started getting much richer and much more consumable by product designers and the product team 10 years ago or so.”
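To see how fine-grained this kind of testing gets, consider what a single design experiment boils down to: show two variants, count the clicks, and ask whether the difference is more than noise. The sketch below is purely illustrative–the numbers are invented, not Google’s–and shows a minimal two-proportion z-test of the sort that underlies an A/B test on something as small as a shade of blue.

```python
import math

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's click-through rate
    significantly different from variant A's?"""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled rate under the null hypothesis that the variants are identical
    p = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return p_a, p_b, z

# Hypothetical numbers: clicks per impressions for two shades of blue
p_a, p_b, z = ab_test_z(conv_a=1100, n_a=50000, conv_b=1210, n_b=50000)
print(f"A: {p_a:.2%}  B: {p_b:.2%}  z = {z:.2f}")
# |z| > 1.96 means the difference is significant at the 95% level
```

A tenth of a percentage point in click-through rate is invisible to any one user, but at hundreds of millions of impressions it compounds into real revenue–which is exactly why tests like this tie design decisions directly to money.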
Being able to make fine-grained observations about the way people use a product ties design directly to engagement. At the same time, insights from psychology have helped designers use behavioral science to increase engagement, too. It’s easy to see the invisible hand of these new tools everywhere, once you start looking. Don’t overwhelm people with settings and menus. Don’t expect them to spend a lot of time reading a privacy policy. Make interfaces slick and fast and usable. Reduce friction. Send regular notifications with rewards. Turn your product into a game. Be their friend.
Meanwhile, a new vanguard of technologists is advocating for designers to step into the fold and lead–including Simply Secure, a three-year-old nonprofit that helps designers, developers, and companies build products that are more secure and more transparent. “For years there was such a huge UX trend toward seamlessness and concealing as much as possible in the interest of making things user-friendly,” says Ame Elliott, Simply Secure’s design director. “Now, as a discipline, interaction designers and UX experts have a lot of hard work to do to think about how to expose those seams in appropriate ways.”
In part, Simply Secure’s approach focuses on educating designers themselves about best practices (some of which you can read about here). That means convincing the design community that privacy and security are part of their ambit–that these issues aren’t boring or impossibly complex, but rather are design problems that demand elegant design solutions. For instance, how do you communicate when and how a voice assistant is collecting data about a person? How can design foster trust in an e-commerce site’s security? How can design help people understand the way their products work, and give them the agency to control their own experiences?
Elliott points to an example of transparent UX from WhatsApp: the app’s automatic delivered and read receipts, communicated through blue checkmarks. Love them or hate them, they give you information about the app’s behavior, and they also change your behavior, Elliott points out. That makes them a great model for clear, transparent UX design. “How do you take the simplicity of that check mark for read receipts, and apply that to voice, and apply that to smart cities, and apply that to autonomous vehicles–all these other emerging technologies?” she asks. “How do we give people immediate feedback about how the system is working?”
Simply Secure hosts workshops, offers design support, and offers an online database of best practices for transparency, security, and privacy–but in a broader sense, it wants to push designers at large to think about these issues as fundamental to their jobs. “I want to come forward and say, ‘Hey, UX designers, you have a leadership opportunity here–to step into a deeper role unpacking some of these privacy and ethical issues,'” Elliott adds.
Elsewhere in the industry, designers are grappling with similar ethical questions about engagement and optimization. Emily Ryan, a UX designer who tweets under the handle @UXIsEverywhere, has experienced the dilemma both as a designer and a user. Ryan recently found herself trying out a mobile game–and immediately wasted two hours (and $1.99) playing it on her couch without a second thought. “It was a very strange moment where a light went off,” she remembers. It’s a balance any designer with a brief to design an effective, engaging experience has to strike: “You want people to spend money on your game and you want them to spend time in it, but there comes a point where that can become detrimental to what’s good for them and what’s healthy for them.”
For designers at large technology companies, wanting to do the right thing can present a complex dilemma. Without walking away from a job or a client, how do you reconcile the client’s wishes with your own definition of what “good” design really is?
It’s an uncomfortable position to be in as a UX designer when, as Ryan puts it, “the clock is ticking, and the client is paying, and the product manager needs something done.” In her experience, the best way to shift a client’s perspective is to get specific about what it could cost them. Ryan comes from the cybersecurity world, and since becoming UX competency lead at Deloitte Digital in 2016 she has been developing an idea she calls “strategic UX.” It’s a method of proving to a client why a dark UX pattern should be avoided, even if it seems like the right call from their perspective. The key? Making a monetary case against it. “Part of getting a business to make the right decision is to tie that back to money,” she explains. “So instead of saying ‘this is morally wrong,’ it’s ‘Hey, here’s what you should be doing, and it’s just good business to do this. And here’s all of the times when people haven’t done it, and this is what it cost them.'”
Followed by subsequent thanks from other users! The article lost readership bc of #badux #darkUX pic.twitter.com/rSv6BoMA7y
— Emily Ryan (@uxiseverywhere) August 23, 2017
Consider the paginated listicle. You’ve seen them across the internet, forcing you to click through a series of slides to read the article. Ryan points to one recent example on the site IHeartDogs.com, “The 10 Least Obedient Dog Breeds,” which makes you click through a series of slides, each with its own ad, to read the list. On Facebook, the first comment is a reader listing the entire content of the story so no one else has to click through all of the ads. Followed, of course, by other readers thanking them profusely for saving them the trouble. “At the end of the day, if you have a user who’s not happy, they’re going to find this workaround,” she says. “And all of a sudden, they’re hacking their own experience, and your UX is going right out the window and it’s a wasted effort not doing anything good for the client.”
The listicle might seem like an effective and sticky bit of UX for publishers who want to juice traffic and ad impressions. “But until you say, ‘here’s the amount of money you’re losing doing that,’ they’re not going to change it,” Ryan says. She admits that this approach takes longer and is harder than taking the path of least resistance. Making the case still falls on the designer’s shoulders.
What we’re seeing now is just the beginning of a discussion around the ethics of UX–and the uneasy balance between what’s good for a company and what’s good for people will surely evolve alongside technology itself. But one thing seems certain: “Friendly” no longer seems like the right word for describing good digital design that’s transparent, ethical, and respectful of users.