
Masters of Design 2011

What Can Steve Jobs Still Teach Us?

Apple's leader has died at the age of 56, having recently stepped down as the company's CEO. He wasn't trained as a designer or an engineer. But he was one of the greatest users of technology ever. That was his secret asset.

In the wake of Steve Jobs's resignation [Ed. Note: And now, death], let's consider the greatest decision he ever made. It didn't happen in a garage in Cupertino, California, sweating with Steve Wozniak as they dreamed up a computer for the common man. Or in a conference room, as managers told him that no one would ever pay $400 for a portable music player. Or in another conference room, as new managers told him no one would ever pay $400 for a cell phone. Rather, it was in an almost forgotten annex on the Apple campus.

Jobs had recently come back to the company after a 12-year hiatus spent running two of his own ventures: NeXT, which made ultra-high-end computers, and Pixar. He was taking a tour of Apple, becoming reacquainted with what the company had become since he'd left. It must have been a sobering, even ugly, sight—Apple was dying at the hands of Microsoft, IBM, Dell, and other competitors that were doing what Apple did, only cheaper and with faster processors.

In a dusty basement across the road from Apple's main building, Jobs found a solitary designer who was ready to quit, languishing amid a stack of prototypes. Among them was a monolithic monitor with a teardrop swoop, which integrated all of a computer's guts into a single package. And in that room, Jobs saw what middle managers did not. He saw the future. Almost immediately, he told the designer, Jonathan Ive, that from here on out they'd be working side by side on a new line of computers.

Jobs may not be the greatest technologist or engineer of his generation. But he is perhaps the greatest user of technology to ever live, and it was to Apple's great fortune that he also happened to be the company's founder.

Those computers that Ive and Jobs worked on became, of course, the iMac—a piece of hardware designed with an unprecedented user focus, all the way to the handle on top, which made it easy to pull out of the box. ("That's the great thing about handles," Ive told Fast Company in 1999. "You know what they're used for.") That single moment in the basement with Ive says a great deal about what made Jobs the most influential innovator of our time. It shows an ability to see a company from the outside, rather than inside as a line manager. He didn't see the proto iMac as a liability or a curiosity. He saw something that was simply better than what had preceded it, and he was willing to bet on that instinct. That required an ability to think first and foremost as someone who lives with technology rather than produces it.

People often say that Jobs is a great explainer of technology—a charismatic, plainspoken salesman who is able to bend those around him into a "reality distortion field." But his plainspokenness had force because he always talked about how wondrous it would be to use something, to actually live with it and hold it in your hands. If you listen to Jobs's presentations over the years, he comes across not as the creator of a product so much as its very first fan—the first person to digest its possibilities.

It's when Jobs has fancied himself the chief creator, rather than first fan, that Apple has stumbled. His much-ballyhooed Power Mac G4 Cube, which was in fact a successor to the NeXT cube he'd developed, was an $1,800 dud. Even before his hiatus from Apple, in 1985, his meddling and micromanagement had gotten out of control. But the years away reportedly helped him begin ceding more responsibilities to others. He became less enamored of tech for tech's sake. He blossomed into a user-experience savant. A reporter who asked Jobs about the market research that went into the iPad was famously told, "None. It's not the consumers' job to know what they want." It's not that Jobs doesn't think like a consumer—he just thinks like one standing in the near future, not in the recent past. He is a focus group of one, the ideal Apple customer, two years out. As he told Inc. magazine in 1989, "You can't just ask customers what they want and then try to give that to them. By the time you get it built, they'll want something new."

Jobs has been criticized for exhibiting a ruthless and arbitrary perfectionism, scrapping a product because it didn't feel right, because some minor feature like a power button or a home screen seemed unresolved. (He notoriously tore through three prototypes of the iPhone in 2007 before finally giving the okay; he berated Ive over the details of the USB port in the first iMac.) But that interpretation is unsophisticated. A myopic focus on details can destroy as much value as it creates. (Think about how often you've sat through a meeting with a boss who harped on details, killing an idea before you had a chance to explain what it could be.) Jobs certainly did not destroy value. True, he killed far more ideas than he let live—there are more than 300 patents under his name covering everything from packaging to user interfaces. But those that survive outweigh all the rest. His focus was, continually, on what it would be like to come at a product raw, with no coaching or presentation but simply as a new, untested thing.

The most obvious example of this hides in plain sight and is a fundamental part of every Apple product. From the 1970s to the 1990s, if you opened up a new gadget, the first thing you faced was figuring out how the damn thing worked. You'd have to wade through piles of instruction manuals written in an engineer's alien English. But a funny thing happened with the iMac: Every year, Apple's instruction manuals grew thinner and thinner, until finally, today, there are no instruction manuals at all. The assumption is that you'll be able to tear open the box and immediately start playing with your new toy. Just watch a 3-year-old with an iPad. You're seeing a toddler intuit the workings of one of the most advanced pieces of engineering on the planet. At almost no time in history has that been possible. It certainly wasn't when the first home computers were introduced, or the first TV remote controls, or the first radios. And it was something Jobs was driving for his entire career. Again, from 1989, Inc. asked him, "Do you sometimes marvel at the effect you've had on people's lives?" Jobs said: "There are some moments. I was in an elementary school just this morning, and they still had a bunch of Apple IIs, and I was kind of looking over their shoulders. Then I get letters from people about the Mac, saying, 'I never thought I could use a computer before I tried this one.'"

Fortuitous timing also worked in Jobs's favor. He came of age just in time to become a founding father of the personal-computer movement. And he was still young enough when he returned to Apple, in 1997, that his own instinctive sense of what a computer might become could be brought to life. In the 1980s and 1990s, computers were sold on their speed and technical capabilities. But by 2000, these features had largely become commoditized—it no longer mattered how fast a computer was when basic issues of usability and integration became paramount. What did speed matter if you didn't know what all the menus meant, or if you were hit with pop-up errors every time you clicked your mouse?

Before 1997, Jobs was ahead of his time: The computers he made were overpriced for the market, because he thought that usability was more important than capability. But as computers reached maturity and became a staple in every home, his obsessions became more relevant to the market. Indeed, many of Apple's recent signature products, such as the iPad or the iPhone, were ideas first conceived in the 1990s or even the 1980s—they had to bide their time.

Jobs is ahead of his time in other ways too: He has taught his entire organization to play in the span of product generations rather than product introductions. Apple designers say that now, each design they create has to be presented alongside a mock-up of how that design might evolve in the second or third generation. That should ensure Apple's continued success for a long time, aided, of course, by the tremendous momentum that Jobs's leadership has provided the company.

It's not clear that anyone else at Apple possesses Jobs's talent for seeing Apple's products from the outside, as a user does. Tim Cook, his anointed successor, proved his worth by revamping Apple's production processes and supply chain. That talent is vital to running the business and has increased Apple's profits by untold billions. But being able to break apart the nuances of sourcing is the exact opposite of being a usability genius: Cook's career has largely been spent focusing on precisely those things the consumer never sees.

Does Cook have an in-house product critic who could stand in Jobs's place? Will Cook have as close a working relationship with Ive as Jobs did? Will Ive even stay? And did Jobs create an entire organization that reflects his balance of concerns—for the back end, yes, but for usability first and foremost? The biggest risk is that Apple takes for granted that its superior design will forever demand a price premium. That might lull it into thinking that Apple itself is great, rather than its products. But Apple, all along, has only been as good as its last "insanely great" thing.

"What would Steve do?" has long been a mantra at Apple (albeit often unspoken). No doubt his example and presence will persist in the organization. The world's most valuable company has one man's vision at its core, in roots that go back 30 years. The unanswered question for Apple is: Who's dreaming its future now?


Photo: Paul Chinn/San Francisco Chronicle/Corbis

A version of this article appeared in the October 2011 issue of Fast Company magazine.