Last night, I swiped my iPhone awake during a commercial. I had no particular destination in mind; it was an absent-minded liminal shift between two of the many screens that constantly, quietly invite me to interact. Without even being cognizant of what I was doing, I tapped the beachy neon hue of the new Instagram icon.
This kind of user behavior is the goal. For Instagram’s design team, I had just provided validation that something as simple as an icon can influence interaction and engagement. It’s one of the most recent and high-profile examples of a much larger sea change in mobile design. For such a tiny, seemingly inconsequential, and largely superficial piece of design, the stakes are surprisingly high: The longer it takes to recognize an icon–however small the lag–the less momentum the user has.
That brief pause between opening your phone and deciding what to do? It’s now the most important millisecond in interaction design. New research helps explain why–and offers clues to improving the user experience.
Scientists have been studying lag since the earliest days of human-computer interaction. In the late 1960s, researchers proposed that there was a strict limit to how long a machine or operating system could take to respond to a user, before that user lost his train of thought.
The general thinking was that two seconds was the upper limit. After that, our short-term memory starts to fail us, and we get tripped up by frustration, or even forget what we were doing. You know the feeling of walking into a room and not knowing what you came for? That’s the enemy. “The longer the content must be held in short-term memory, the greater the chances of forgetting or error,” wrote IBM researcher Robert Miller in his famous 1968 paper about the psychology of lag times in technology.
Miller set out some basic benchmarks for response times that are still in use today. For example, users can keep their train of thought if the response time is less than one second, and they will perceive a response as instantaneous if it’s under 0.1 second, as Jakob Nielsen has also written. When it comes to your phone, the response time spans everything between your intent–posting something on Instagram, say–and acting on it: unlocking your phone, swiping through dozens of icons to find the right one, and waiting for the app to load. Whittling down the time it takes you to recognize a logo is the one part of that use case designers actually can affect.
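Those classic thresholds can be sketched as a simple classifier. This is an illustrative sketch only–the function name and category labels are my own, not Miller’s or Nielsen’s; only the 0.1-, 1-, and 2-second boundaries come from their work.

```python
def perceived_responsiveness(latency_s: float) -> str:
    """Classify a UI response time against the classic Miller/Nielsen benchmarks.

    The thresholds (0.1s, 1s, 2s) are from the literature; the labels
    are illustrative.
    """
    if latency_s <= 0.1:
        # Feels like a direct, instantaneous result of the user's action.
        return "instantaneous"
    if latency_s <= 1.0:
        # The delay is noticeable, but the user keeps their train of thought.
        return "noticeable"
    if latency_s <= 2.0:
        # Approaching Miller's upper limit; short-term memory is strained.
        return "strained"
    # Past two seconds, the user may get frustrated or forget the task.
    return "broken"


print(perceived_responsiveness(0.05))  # instantaneous
print(perceived_responsiveness(1.5))   # strained
```

Every step in the unlock-swipe-recognize-load chain gets budgeted against these bands, which is why shaving milliseconds off icon recognition is worth a design team’s time.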
Around the same time Miller was writing, conventional graphic designers were working on another new problem: universal wayfinding. The world was being globalized–by the explosion of transcontinental air travel and even cultural events like the Olympics. Cities and countries needed to communicate with a broader range of visitors who might not speak the local language. The solution? Pictograms–a universal language of icons that could replace conventional words. Originally invented in the prewar era by designers like Otto Neurath, the idea of a universal picture language gained new relevance in the increasingly global world.
Pictograms found a natural home in Silicon Valley in the 1970s, with engineers developing the earliest graphical user interfaces. They landed upon a slick idea: Rather than use text to communicate, why not show users a fake, metaphorical “desktop,” where applications and actions were represented by icons instead of words? That way, anyone who’s seen a desk in real life could immediately perceive how to use a digital version. In 1973, the Xerox Alto debuted the first graphical user interface, and by 1981 its successor, the Xerox Star, offered a full desktop metaphor complete with file folders, trash cans, disks, and other pictograms.
It’s the same basic visual language we use today, give or take a beveled corner and neon gradient or two.
In the 50 years since those early days, the way we interact with computers has changed immensely. Today, we talk about milliseconds, not seconds.
As our interactions have gotten faster and more frequent, our behavior has subtly changed, too. We are in a near constant state of interaction with our phones, and not always with a specific purpose. We look at our phones when we’re bored, when we’re uncomfortable, when we’re lonely, when we’re overwhelmed, when we’re excited. Ten years ago, we opened our phones to send a message or look at the weather. Today, we open them without a good reason.
So, more than ever, interaction design is a science of margins, millisecond glances, and balancing cognitive load. And unsurprisingly, researchers are increasingly studying the way we use our screens in terms of brain science. Take Instagram’s new logo and icon, which designer Ian Spalter told Co.Design was shaped by asking the company’s employees to draw the icon from memory. “That gave us a sense of what was burned in,” Spalter told Cliff Kuang. Then the team conducted extensive qualitative testing focused on how well users could recognize the new design.
There’s some scientific evidence supporting that approach. In her paper, “Graphics and Semantics: The Relationship between What Is Seen and What Is Meant in Icon Design,” Sarah Isherwood debunks the idea that an icon or logo is easier and faster to recognize if it has a concrete visual relationship to its function. After people get used to abstract icons, they’re just as good at recognizing those icons. More important? Our familiarity with them.
Other recent studies go even further, suggesting that whether we think an icon is beautiful may affect how quickly and easily we identify it. Whether a logo or icon appeals to you–whether you enjoy it–is an overlooked aspect of design that’s only now being recognized. We’re good at using beautiful things–or is it that we perceive things that are easy to use as beautiful?
If familiarity is so important to how we use interface elements like icons, that helps to explain why users get so upset when one changes. Uber’s design team saw this firsthand in the spring, when a major design overhaul elicited shock and horror from thousands of users who, apparently, had a deep emotional connection to the Uber icon. But according to Uber, the engagement numbers didn’t suffer in the long run: It was simply a matter of users semantically connecting the new design with the old function, a kind of “muscle memory” that can take a few weeks to retrain.
The same seems likely to be true for Instagram’s new icon. “As evident by the outrage the Instagram icon change has kicked up, people have very personal connections to the glowing squares on their phones,” says Danish designer Michael Flarup, who runs an icon design site.
But are there specific design elements that can increase user performance? It certainly seems so. A recent study from University of Michigan researchers tested what type of iconography elicited quick reactions from drivers. Using eye tracking, they tested how different signage designs–like kids playing in a street, or wet floors–were interpreted by drivers. Across the board, drivers reacted much more quickly to iconography that depicted dynamic movement rather than static figures, suggesting that icons that look like they’re in motion tend to “prepare us for action.”
You can find dynamic elements all over your home screen; a good example is iOS weather apps, since they all must compete for attention with Apple’s own. Look no further than DarkSky, one of the more popular weather apps in the App Store, to find dynamic iconography. The localized weather app uses a lightning bolt in mid-streak across the sky, which actually feels like it’s in movement. (DarkSky’s creators, for their part, say the effect was a product of “going through a million iterations until it was simple and not ugly.”) Compared to the static cloud and sun of iOS’s weather app, it feels fresh, relevant–almost imminent.
Interesting though these studies are, they’re just the tip of the iceberg when it comes to truly understanding the way human brains interact with interfaces. As more designers use scientific methods, we’re going to see the study of design emerge in earnest.
In 2014, the Association for Information Science and Technology published a paper called “How Are Icons Processed by the Brain? Neuroimaging Measures of Four Types of Visual Stimuli Used in Information Systems”. In it, three University of Texas researchers describe how they set out to discover something seemingly very simple: Was it easier for human brains to analyze images or words?
Using MRIs, they were able to watch as subjects’ brains encountered iconography and text, and see what parts of their gray matter got called up for the job. Oddly enough, it turned out that understanding a picture or icon requires a lot more of your brain’s resources than just interpreting a word. This could have a “profound” impact on UI, the authors conclude; after all, if reading text requires far less mental effort, are icons really as useful as we think? Or are text-based interfaces–like conversational UI or voice-controlled UI–a better way to deal with cognitive load in users?
The results suggest that we’re about to enter a new era for design, as tools like neural imaging, eye tracking, and cognitive psychology become more common in design studios and technology companies. But who knew the humble app icon would be its avant-garde?