On August 2, 2018, Apple became the first American public company worth more than $1 trillion. If anything, that abstract figure understates the company’s reach. Apple makes the first thing that hundreds of millions of people look at when they wake up. The company’s supply chain can extract trace amounts of rare minerals from a mine in the Democratic Republic of the Congo, embed them in one of the planet’s most advanced computers, and deliver the whole thing to the steppes of Mongolia. And yet Apple’s rise is nothing more or less than the story of three interfaces: the Macintosh OS, the iPod click wheel, and the iPhone touchscreen. Everything else has been fighting about how the pie would be divided up among competitors and copycats.
In the user-friendly world, interfaces make empires: IBM, with its punch-card mainframes, was an empire until the 1970s. Then came the graphical user interface, which transformed both Apple and Microsoft from niche companies into mainstream Goliaths. (In April 2019, Microsoft became the third American company to reach a $1 trillion valuation, right behind Amazon.) Apple, of course, nearly died in the late 1990s; a major part of what saved the company in the years after Steve Jobs returned was the iPod’s click wheel, which cracked the problem of making it fun to browse incredibly long lists. BlackBerry, with its telephone lashed to a keyboard, was another empire until the iPhone. Even Amazon grew from an interface idea: 1-Click shopping.
The value of the patent alone is staggering: Amazon reportedly made millions licensing it to Apple for the launch of the iTunes store. But its value to Amazon has been far greater. By eliminating all the checkout steps required to buy something online, 1-Click gave Amazon a decisive edge against cart abandonment, which, according to some studies, averages 70 percent and remains one of the two or three biggest challenges to online retailers. 1-Click made impulse buys on the web actually impulsive. The boost from 1-Click shopping has been estimated to be as high as 5 percent of Amazon’s total sales: an astonishing figure, given that Amazon’s operating margin hovers below 2 percent. It also incentivized Amazon’s customers to stay logged in at all times, which allowed Amazon to silently build up a user profile in its databases, which in turn allowed Amazon to become a platform for selling and recommending anything, not just books. Amazon’s 1-Click would easily be the single most consequential button ever invented, but for the Facebook Like button.
Apple’s two great innovations, the graphical user interface and the touchscreen, are cousins united by a deeper vein of metaphor. The Macintosh OS got its user-friendliness from the intuitive physics of its interactions: interactions that felt natural precisely because they were borrowed from our intuitions about the physical world. The bridge was the desktop metaphor. The touchscreen wasn’t so much a new metaphor as a better input device. First, the mouse cursor stood in for your hand, when the world was a screen. Then the mouse cursor disappeared, when the screen itself could sense your touch. The iPhone wasn’t a break from the Mac so much as its fulfillment.
It may seem strange to say that the iPhone inherited its logic from the desktop computer, especially if you didn’t grow up using a mouse. But it’s there: the way you tap an app to open it; how you can drag apps around the home screen; the idea of an app itself, able to deliver email or calendar appointments or news; the back button and the close button. Yet all this logic exists quietly. We don’t notice the desktop metaphor anymore because we no longer need it to explain how we’re supposed to use a modern computer.
That’s how metaphors work: Once their underlying logic becomes manifest, we forget that they were ever there. No one remembers that before the steering wheel in a car, there were tillers, and that tillers made for a natural comparison when no one drove cars and far more people had piloted a boat. The metaphor disappeared once driving cars became common. In digesting new technologies, we climb a ladder of metaphors, and each rung helps us step up to the next. Our prior assumptions lend us confidence about how a new technology works. Over time, we find ourselves farther and farther from the rungs we started with, so that we eventually leave them behind, like so many tiller-inspired steering wheels. Or like the various metaphors—hyperlink, browser, search engine—that taught Westerners how to use the World Wide Web.
But metaphors don’t last forever—and the story of how they break down is the story of why so much technology that once seemed magical slowly becomes painful. We can watch the breakdown of one metaphor, created by Apple, from our own phones. Through the late 2000s and early 2010s, the company was lambasted in the design community for its skeuomorphs, which the Oxford English Dictionary defines as “an element of a graphic user interface which mimics a physical object.” These had started out usefully, but over the decades reached a pointless level of detail. At one time, it was important for a file “folder” to indeed look like a folder, so that you knew it did the same thing. By the early 2010s the details had gotten baroque. To know how the calendar worked, you didn’t need the calendar on every Mac to look as if it had been bound by stitched leather; to know that you could buy books via the iBooks app, there didn’t need to be digital shelves, made of digitally rendered wood.
The design community’s bias against skeuomorphism had descended from the Bauhaus, which, at the dawn of modern design, declared a break with tradition by decrying decorative flourishes meant to link the new world with the old—for example, the Art Nouveau metalwork of the Paris Métro entrances, where cast iron was fashioned to look like ornate vines. The Bauhaus was born of the idea that materials should do exactly what they were suited to, and only that. Marcel Breuer’s famed tubular-steel chairs supported the sitter on a novel cantilevered frame, and that steel was chromed to highlight the fact that only metal could accomplish the feat. In the context of computers, what had once been a necessary ingredient of user-friendliness, fidelity to the real world, had descended into a kind of dishonesty. Should pixels look like metal and wood if they’re not in fact metal and wood?
It wasn’t a surprise that Jony Ive, an industrial designer by training, weaned on a belief in materials, the designer of the candy-colored iMac and then the iPhone, would hew to a faith in material honesty. When Ive took over software design at Apple in 2013, he introduced a clean new visual language for the iPhone’s operating system. At the time, this was trumpeted as proof that Ive’s good taste had finally won out over the company’s ideologues, such as Scott Forstall, who’d overseen iOS for years and remained slavishly devoted to the personal tastes of Steve Jobs, who had died two years before.
But what actually happened was simply that Apple’s founding metaphors, which had been handholds for a nervous migration to the digital world, were now irrelevant. You didn’t need the calendar on your iPhone to look like the one on your desk, if, like most people, you’d already discarded the one on your desk because of the iPhone. The rule for metaphor in design is fake it till you make it. Apple had made it, after faking it for so many years.
There are stakes for the companies that create these metaphors, and stakes for us, the people who live with them. As Apple’s visual metaphors started to age into incoherence, its underlying metaphors started to break down as well, making our digital lives ever more confusing. When Apple unveiled the App Store in 2008, no one was certain how big it could become. Initially, there were around five hundred apps available. In later years, we’d see the explosion of the so-called app economy, and the sudden dominance of mobile computing. What’s never asked is why the App Store made sense to users, and how those initial assumptions shaped what followed. There was a metaphor underlying it all.
All the way up until the late nineteenth century, stores worked very differently than they do today. The goods were placed behind the counter on shelves or under glass. Shoppers on the high streets of Paris and London were typically upper-class, and if they wanted to see something, they had to ask the shopkeeper to get it for them. It was up to the shopkeeper to explain the story of a product. This changed by the turn of the century, thanks to pioneers such as Harry Gordon Selfridge. Beginning at Marshall Field’s in Chicago, Selfridge experimented with a retail concept that the world had never seen, in which the goods didn’t sit behind the counter. Alone on a shelf, the goods had to sell themselves.
A century later, this remains the standard in stores across the planet. It’s how software was sold in the Apple Store when it opened in 2001, in boxes laid out one next to the other. By the time the App Store came along, it made sense that it would look much like those open shelves. But in selling apps like that, right on a smartphone, the inevitable implication was that apps were stand-alone goods—things like Microsoft Word, which you used for a specific purpose.
As the app economy grew, this assumption started to crack. To arrange a dinner with friends, you have to text them, find a restaurant, text them again, find a time, look for a reservation, agree on the reservation, mark it in your calendar. It’s maddening. A technology that once seemed effortless now seems to make us do its bidding.
The only reason those frustrations exist is that the metaphor that begot the app economy was the wrong one. Underlying the structure of all the apps we use is the internet, and its infinite web of connections. But we consume apps through the metaphor of the store, through the assumption of stand-alone goods that we use one at a time, rather than in a web of references.
Those two paradigms conflict, and they often only barely line up with the ultimate purpose our phones are meant to serve: keeping what we care about and whom we care about within reach. As a result, smartphones put the burden of piecing things together on us. Resolving that conflict will require a new metaphor for how smartphones work, and when someone finds it, our digital lives will evolve. Imagine if, instead of apps, our smartphones were built around the relationships we care about: if, instead of opening an app to connect with those we love, we simply remained connected, and the tools to bring us closer appeared only when we needed them, in the flow of our relationships with one another. Who knows how much easier, how much more satisfying, our digital lives might be if the governing metaphor for smartphones were one of human connection rather than programs.
Excerpted from User Friendly: How the Hidden Rules of Design Are Changing the Way We Live, Work, and Play, by Cliff Kuang with Robert Fabricant, published by MCD, an imprint of Farrar, Straus and Giroux, on November 19, 2019. Copyright © 2019 by Cliff Kuang and Robert Fabricant. All rights reserved.