AI can offer amazing user experiences when it’s well designed–from more effective spam filters to digital assistants that understand the nuances of your voice. But according to interaction designer and Carnegie Mellon professor John Zimmerman, many UX designers are utterly unprepared to design this new wave of AI-centered interfaces.
Zimmerman has designed intelligent systems for 20 years–everything from a TV recommendation system for Philips to a system designed to sense depression and an interface for an algorithm that helps cardiologists decide whether or not to perform heart surgery. He believes there’s a gaping chasm between AI and UX. “UX designers right now go out and do a bunch of field work, but they fail to see opportunities where machine learning can add value,” he said at the Google People + AI Research Symposium in Cambridge, Massachusetts, this month.
There are many reasons for that. For one, the technology itself is complex, limiting designers’ ability to play with it or even gain a tacit understanding of it. For another, machine learning isn’t part of a standard design education, nor is it included in many mainstream design tools. Zimmerman believes this has led to a gap in UX designers’ skill sets. Yet it’s important that designers think of AI as just another tool in their toolbox–a material to be used responsibly and ethically.
Where’s The Machine Learning Playground?
“Design generally evolves new ideas through a conversation with materials, where you develop a tacit understanding of the material’s capabilities,” Zimmerman tells Co.Design. “This is very hard for designers to do with software, but it’s particularly hard with machine learning.”
Zimmerman points to Ray and Charles Eames’s furniture breakthroughs to demonstrate how designers need to play with and dissect materials to fully understand them. The duo’s material of choice was plywood–they were so obsessed with it that they made it themselves. “Through that process they came up with this entirely new idea for furniture,” he says. “They found a super inexpensive way to manufacture furniture that had a very different look. But it came from playing with the material. Traditionally we train design students by sending them to the shop and studio, to cut paper and play with plastic. We don’t have a machine learning shop.”
In other words, it’s really hard for designers to experiment with machine learning because the technical barriers to entry are still so high. The Eameses didn’t need to be chemical engineers to play around with plywood–but to play with machine learning, you often need a deep understanding of math, data, and statistics.
Some companies, Google included, are working on this problem by building programs that automate the behind-the-scenes process of constructing a machine learning model, which can otherwise require a PhD. But these tools aren’t mature yet, and relying on them could ultimately mean ceding design decisions to the company that built them.
But in the meantime, Zimmerman thinks a shift in mind-set is in order. One of the most obvious examples of how machine learning could be used in UX is through adaptivity, or products that learn how a person uses them and then change to accommodate that person. For instance, an adaptive, machine learning-powered UI would learn if you always use the Starbucks app to pay for your coffee and automatically pull up that screen when you’re inside a Starbucks. Companies like Zappos could choose to fill in your shoe size when you’re shopping on the website, or only show you styles in your size. Adaptability isn’t thought of as a standard design element yet, though.
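The Starbucks example above doesn’t require a sophisticated model at all–even simple frequency counting gets you an interface that adapts to repeated use. Here’s a minimal sketch of that idea; the class and method names (`AdaptiveScreenPicker`, `record`, `suggest`) are hypothetical, invented for illustration, not from any real app:

```python
from collections import Counter, defaultdict


class AdaptiveScreenPicker:
    """Hypothetical sketch of an adaptive UI: learn which screen a user
    opens most often in a given context (say, a store location) and
    surface that screen first the next time they're there."""

    def __init__(self, default_screen="home"):
        self.default_screen = default_screen
        # context -> counts of the screens the user opened there
        self.history = defaultdict(Counter)

    def record(self, context, screen):
        """Log that the user opened `screen` while in `context`."""
        self.history[context][screen] += 1

    def suggest(self, context):
        """Return the screen used most often in this context, falling
        back to the generic default when there's no history yet."""
        counts = self.history.get(context)
        if not counts:
            return self.default_screen
        return counts.most_common(1)[0][0]


picker = AdaptiveScreenPicker()
for _ in range(3):
    picker.record("starbucks", "pay")
picker.record("starbucks", "menu")

print(picker.suggest("starbucks"))  # "pay" -- the learned preference
print(picker.suggest("airport"))    # "home" -- no history, so default
```

A production system would weight recent behavior more heavily and decay stale counts, but the design question is the same one Zimmerman raises: deciding, at the wireframing stage, which parts of the interface should learn at all.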
Zimmerman is the first to admit that thinking about how a system learns over time is not part of his own personal design process. For instance, he’s working on a crowdsourced transit mapping project called Tiramisu, which was not initially designed to learn from users’ behavior. Zimmerman says the idea didn’t even occur to him until a new PhD student on the team asked him about it. “It was a super obvious question,” he says. “It made me think about how we don’t even think about [making something adaptive] when we’re doing sketches and wireframing. I’ve been reading about this since the mid ’90s. But it’s not even in the mind-set to say, is this [interface] going to learn?”
Part of the problem is that designers don’t necessarily think about adaptability. But there are also myriad new challenges that come with building machine learning into products. For instance, one of the central concepts Zimmerman is currently thinking through is the idea of an “undo” button. Take the Zappos example. What if you’re shopping for shoes for someone other than yourself, and the site is only showing you shoes in your size? There needs to be a way out–an “undo”–so users can get back to a generic version of the interface. How to achieve that effectively is an open question facing UX designers today.
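One way to frame the “undo” problem is that personalization should be a reversible view over the data, not a destructive change to it. This sketch illustrates that framing with the Zappos-style example; everything here (`AdaptiveShoeFilter` and its methods) is hypothetical, invented for illustration:

```python
class AdaptiveShoeFilter:
    """Hypothetical sketch of an adaptive filter with an explicit 'undo':
    the interface personalizes by default, but always offers a path back
    to the generic, unfiltered view -- without throwing away what the
    system has learned."""

    def __init__(self, learned_size=None):
        self.learned_size = learned_size
        self.personalized = learned_size is not None

    def visible_styles(self, catalog):
        """catalog: list of (style, size) pairs. Show only the learned
        size when personalized, otherwise everything."""
        if self.personalized:
            return [s for s, size in catalog if size == self.learned_size]
        return [s for s, _ in catalog]

    def undo_personalization(self):
        """The escape hatch: revert to the generic interface, e.g. when
        shopping for someone else. The learned size is kept, not erased."""
        self.personalized = False

    def redo_personalization(self):
        """Opt back in once the user is shopping for themselves again."""
        if self.learned_size is not None:
            self.personalized = True


catalog = [("runner", 9), ("loafer", 10), ("boot", 9)]
f = AdaptiveShoeFilter(learned_size=9)
print(f.visible_styles(catalog))  # ["runner", "boot"] -- personalized
f.undo_personalization()
print(f.visible_styles(catalog))  # all three styles -- generic view
```

The open design questions sit on top of this scaffolding: where the undo control lives, whether the system asks before personalizing, and how the user discovers that a filtered view is filtered at all.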
Bringing AI To Education And Design Tools
Zimmerman believes there are ways to help designers prepare for the future. The first lies in design education. “The lowest bar is to simply give students assignments where they need to design an intelligent system and a part of their design is to envision how the intelligence is working below the hood and how the user is going to interact with that,” he says. “That’s the simplest thing: to let them know, people are going to be asking this of you.”
He also proposes curricula that pair design students with data science students, so they get the opportunity to work with someone making models and have the opportunity to experience the kinds of problems machine learning can solve. It’s a plus for data science students, too–they get a lens into the design process and understand how the work they’re doing affects real people.
Outside of education, Zimmerman thinks common design software itself isn’t offering designers the tools they need to prototype and plan adaptive interfaces. “They certainly encourage you to think about navigation. They scaffold you in all kinds of ways,” Zimmerman says. “Why are we not also embedding in those tools the very simplest issues of learning and adaptation to people’s repeated use?”
While the concept of “interfaces that learn” isn’t new, the fact that it hasn’t broken into mainstream design tools means it isn’t really part of designers’ tool kits yet. “There’s a huge opportunity for Sketch, Adobe, or whoever wants to be the next big tool, to support designers in thinking this through,” Zimmerman says.
Are designers as far behind as Zimmerman thinks? And how can we close the gap between AI and UX? Let us know what you think by emailing us at CoDTips@fastcompany.com.