Everyone knows Parisians are snobs. So it probably shouldn't have come as a surprise that an unshaven, middle-aged American, speaking English and dressed in cuffed jeans, sneakers, and a worn black T-shirt, was rudely turned away from the bar at a lavish fête inside Paris's Musée d'Orsay on September 16, 2003.
Except that the man was Steven P. Jobs, the cofounder and chief executive of Apple Computer Inc., and it was his party. And some bash it was. For three hours, Apple's guests grazed on foie gras and seared tuna canapes, and sipped champagne while strolling under a massive glass arcade that shelters one of the world's largest collections of Impressionist masters, Rodin sculpture, and art nouveau furniture. In a Baroque salon at the far end of the museum, a raucous jazz band played. As one guest observing the scene intoned, "This is huge."
Not huge enough, it seems, to make room for Jobs. But if the boss was peeved at getting the bum's rush, he didn't show it. Together with his entourage of suited computer executives, Jobs retreated quietly to a bar on a lower level, and the party celebrating the 20th anniversary of Apple's European trade show, Apple Expo, proceeded without further incident. Maybe "Bad Steve" has mellowed at the age of 48.
Then again, maybe Jobs has just gotten used to being tossed out of his own parties. You could say that the personal-computer industry itself began as an Apple wingding when the Cupertino, California-based company introduced the Apple II in 1977. Ever since, Apple has played the role of generous host, spicing up the festivities with one tasty offering after another. Following the PC, Apple served up many of the features that computer users have since come to take for granted, including the graphical user interface, the mouse, the laser printer, and the color monitor. Yet Apple has been forced to watch the celebrations from out in the alley, its nose pressed longingly to the window as others feast: Today, more than a quarter-century after its founding, it commands just 2% of the $180 billion worldwide market for PCs. Almost everyone agrees that Apple's products are not only trailblazers but also easier to use, often more powerful, and always more elegant than those of its rivals. Yet those rivals have followed its creative leads and snatched for themselves the profits and scale that continually elude Apple's grasp.
All of which raises some interesting questions. If Apple is really the brains of the industry—if its products are so much better than Microsoft's or Dell's or IBM's or Hewlett-Packard's—then why is the company so damned small? (Consider that in the last 10 years alone, Apple has been issued 1,300 patents, almost one-and-a-half times as many as Dell and half as many as Microsoft—which earns 145 times as much money.)
TRUTH IS, SOME OF THE MOST INNOVATIVE INSTITUTIONS IN THE HISTORY OF AMERICAN BUSINESS HAVE BEEN COLOSSAL FAILURES.
The Creativity Conundrum
Conventional wisdom has long answered that Apple is the victim of a single, huge strategic error: the decision in the 1970s not to license its operating system. But that was long ago and far away. Apple has since had many opportunities to reverse its infamous decision, but it hasn't done so. And Apple's creativity has produced plenty of other opportunities to compensate for the initial misstep. The company could, for example, have exploited an early beachhead in the $12 billion education market for PCs (it once dominated that market but now can claim only 10% of it), to push its way back into homes. But it failed to develop the aggressive sales force to do so. Apple has missed chances to own new markets, too. It introduced the world to pen-based computing with its Newton mobile device in 1993. Newton had its problems—it was clunky, hard to use, and probably ahead of its time. But it still seems baffling that Apple failed to capture a meaningful stake in the $3.3 billion market for personal digital assistants (PDAs), a business that by some measures is now growing faster than either mobile phones or PCs.
That Apple has been frozen out time and again suggests that its problems go far beyond individual strategic missteps. Jobs may have unwittingly put his finger on what's wrong during his keynote speech earlier that day in Paris. "Innovate," he bellowed from the stage. "That's what we do." He's right—and that's the trouble. For most of its existence, Apple has devoted itself single-mindedly, religiously, to innovation.
But wait. What can possibly be wrong with that? After all, we worship innovation as an absolute corporate good, along with such things as teamwork and leadership. Even more than these virtues, it has come to be seen as synonymous with growth. Political economists have assigned tremendous significance to it since at least the mid-20th century. Innovation is at the heart of Joseph Schumpeter's idea of creative destruction, for example: the process of "industrial mutation" that keeps markets healthy and progressive. Management theorists embraced the notion in the intervening decades, and a stream of academic papers and books promoting innovation as the critical element of business success issued forth from the likes of Peters and Drucker, Foster and Christensen. Innovate or die, we were told. It's the core of excellence and the root of entrepreneurship. It's the attacker's advantage, the new imperative, the explosion, the dilemma and the solution. (You can play this game at home, too, with any of the 49,529 titles that come up for "innovation" on Amazon.) And yet it's hard to look at Apple without wondering if innovation is really all it's cracked up to be.
Nor is Apple's the only case that should give us pause. Truth is, some of the most innovative institutions in the history of American business have been colossal failures. Xerox Corp.'s famed Palo Alto Research Center (Xerox PARC) gave the world laser printing, ethernet, and even the beginnings of the graphical user interface—later developed by Apple—yet is notorious for never having made any money at all. Polaroid, which introduced us to instant images decades before digital photography, collapsed under mismanagement and filed for Chapter 11 bankruptcy protection in October 2001. The Internet boom of the late 1990s, of course, now stands revealed as a sinkhole of economically worthless innovation. ("I know: Let's offer online ordering and free delivery of $1.49 bags of Cheez Doodles!") And Enron was arguably the most innovative financial company ever. So it turns out that not all innovation is equal. Not all of it is even good.
But the paradox of Apple is in many ways more disturbing because its innovations haven't been precommercial, like Xerox PARC's; they haven't been superseded, like Polaroid's; they haven't been frivolous, like those of the dotcom bubble; and they haven't been destructive, like Enron's. They've been powerful, successful, useful, cool. Since its earliest days, Apple has been hands-down the most innovative company in its industry—and easily one of the most innovative in all of corporate America.
Jobs was justly proud as he regaled his audience of 3,700 at the Palais des Congrès in September. He prowled the stage for two hours, exulting in the details of Apple's numerous 2003 product launches. Chief among them were the new G5 desktop, the first 64-bit computer and the industry's fastest ever; a new operating system called Panther; a 15-inch laptop that comes with an ambient-lit keyboard for working in the dark; and Apple's first wireless mouse. Even by Apple standards, it was a banner year for snazzy new gear.
And as if that weren't enough, Jobs then reminded the crowd of the year's most important product debut, Apple's digital-music store known as iTunes. When it was launched in late April, iTunes became the first legal, pay-as-you-go method for downloading individual tracks of recorded music. Music fans and the recording industry alike loved it, and by the end of the year, more than 20 million songs had been purchased and downloaded off Apple's site. Soon the trade press was touting iTunes as "revolutionary," "groundbreaking," and a "paradigm shift" for the market. Time magazine recently hailed it as the "Coolest Invention of 2003."
But even in that banner year, Apple's creative energy hasn't amounted to very much in financial terms. For its fiscal year ending September 27, 2003, Apple reported just $6.2 billion in revenues, three-quarters of it from the sale of personal computers. The father of the PC—and, remember, the industry's number-one vendor in 1980—has since sunk to a lowly ninth, behind competitors Dell, Hewlett-Packard, and IBM, just for starters. Sadly, Apple is also behind such no-namers as Acer (seventh) and Legend (eighth). So much for innovation and creativity. These clone-makers, based respectively in Taiwan and China, exist solely to churn out gray boxes at the lowest possible cost. It may very well be that, without its relentless innovation, Apple would have simply ceased to exist long ago, going the way of Commodore and Kaypro in this unforgivingly Darwinian industry. But all its creativity certainly hasn't put it at the top of the food chain.
Where Apple was once one of the most profitable companies in the category, its operating profit margins have declined precipitously from 20% in 1981 to a meager 0.4% today, just one-fifth the industry average of 2%. And it isn't just the hardware manufacturers that are devouring Apple. Its chief competitor in software, Microsoft, earned $2.6 billion in its most recent fiscal quarter (ending September 30). That's nearly 15 times the $177 million in software sold by Apple in its most recent fiscal quarter and roughly equal to the profits that Apple has earned from all of its businesses over the past 14 years. In just three months.
With such examples as Apple in mind, a number of skeptics are beginning to ask whether our heedless reverence for innovation is blinding us to its limits, misuse, and risks. It's possible, they say, to innovate pointlessly, to choose the wrong model for innovation, and to pursue innovation at the expense of other virtues that are at least as important to lasting business success, such as consistency and follow-through. When it comes to economic value, Schumpeter's creative destruction may have an evil twin: destructive creation.
James Andrew, of the Boston Consulting Group, for example, argues that too many companies presume that they can boost profits merely by fostering creativity. "To be a truly innovative company is not just coming up with great new ideas, or products and services," he says. "It is coming up with ones that generate enough cash to cover your costs and reward your shareholders."
Andrew says companies can boost the odds of their success by choosing the most appropriate of three innovation models. The first and most traditional is the integrator model, in which a company assumes responsibility for the entire innovation process from start to finish, including the design, manufacture, and sale of a new technology. In general, large, well-heeled companies—Intel, for example—do best with this model. Second is the orchestrator approach, in which functions such as design are kept in-house, while others, including manufacturing or marketing, are handed off to a strategic partner. This model works best when speed is of the essence, or if a company wants to limit its investment. When Porsche couldn't meet demand for its popular Boxster sports coupe in 1997, for example, it turned to Finnish manufacturer Valmet rather than open another costly plant. Finally, Andrew says, there's the licensor approach, in which, for example, a software company licenses a new operating system to a series of PC manufacturers to ensure that its product gets the widest distribution at the lowest possible investment cost. That's you, Microsoft.
From the beginning, Apple appears to have employed the integrator approach—the model with both the highest costs and highest risks. On the one hand, it was the least appropriate choice for a startup with scant financial resources and a nonexistent customer base. But it was probably the inevitable choice for Apple's innovation-venerating culture, which demanded something akin to absolute artistic control. Jobs declined to comment for this story, but he has expressed an almost mystical reverence for the power of innovation over the years. In 1995 remarks to the Smithsonian Institution, for example, he compared innovation to "fashioning collective works of art" and said it afforded "the opportunity to amplify your values" over the rest of society.
The ambition to build the "perfect machine" drove Jobs and his cofounders, A.C. "Mike" Markkula and Steve Wozniak, to strive to build everything, from hardware to software, in-house regardless of cost. Even in those early days, peers like Microsoft were moving to specialize in one dimension of computing or another. (Apple now farms out much of its manufacturing, but won't say how much.)
This pursuit of perfection also led Apple's founders to opt for a closed operating environment on the early Macintosh computers. A closed computing environment is easier to control than an open one. Applications can be written to integrate with one another seamlessly, making the system less buggy. A better user experience!
"There was a lot of elitism at the company," says engineer and Apple alum Daniel Kottke. A Reed College classmate of Jobs who later traveled with him to India, Kottke became Apple's first paid employee in 1976. "Steve definitely cultivated this idea that everyone else in the industry were bozos. But the goal of keeping the system closed had to do with ending the chaos that had existed on the earlier machines." Kottke left Apple in 1984, a year before Jobs himself was forced out.
Apple's purist approach may well have made certain early innovations possible—networking, for example, which it introduced on the first Mac machines in 1984. Windows PCs didn't have printer networking until the mid-1990s. But time and again, Apple's obsession with controlling the entire process of innovation has also demonstrated the truth of Voltaire's dictum that the perfect is the enemy of the good. Today, the company has just 300,000 independent and in-house developers writing programs and making products for its operating systems, including the latest, OS X. More than 7 million developers build applications for the Windows platform worldwide.
APPLE'S DEMAND TO CONTROL THE ENTIRE PROCESS OF INNOVATION SHOWS HOW THE PERFECT CAN BE THE ENEMY OF THE GOOD.
Fewer developers mean fewer new products to run on Apple machines. That means fewer options for end users, which influences purchasing decisions, and therefore sales and profits. One example of a popular product not easily available for the Mac is the personal video recorder, or PVR. That's the TiVo-like device that lets users pause, rewind, and record live television programs on their PCs. Only two developers offer a PVR for the Mac: Elgato Systems' EyeTV and Formac's Studio TVR, retailing at $199 and $299 respectively. At least six software or consumer electronics vendors produce Windows-compatible PVRs, and Microsoft itself gives away PVR capability as a standard feature on Windows XP Media Center.
Apple has consistently rejected opportunities to adjust its innovation strategy to another model. Licensing its operating system to hardware manufacturers would have been an obvious choice. Yet when Jobs returned to Apple in 1997, he terminated the first and last licensing program, championed by former chief executive Gilbert Amelio. Jobs is reported to have told Apple managers that he feared "Mac knockoffs" would dilute the Apple brand.
Spurning The "Gee Job"
At the heart of Apple's innovation conundrum also lies a powerful cultural bias: the lionization of purely technical innovation. Ours is a material society. So it's natural that when we think of innovation, we are more inclined to think of objects, things that we can see, touch, and feel, and of inventors such as the Wright brothers and Thomas Edison. It turns out, though, that the most economically valuable forms of innovation often aren't the tangible kind. Instead, they are forms of innovation that we might belittle as less heroic, less glamorous: the innovation of business models. Don't think innovator-as-hero; think innovator-as-bureaucrat. Even Edison—who held 1,093 patents (more than anyone else in U.S. history) and who invented such doodads as electric light, the phonograph, and the motion picture—fared pretty badly when it came to choosing business models. He waged and lost one of the world's first technology-format fights, between alternating and direct currents. And he abandoned the recording business after, among other things, insisting that Edison disks be designed to work only on Edison phonographs. Sound familiar?
In virtually any industry, business-model innovators rather than technical innovators have reaped the greatest rewards in recent decades, argues Gary Hamel, the chairman of Strategos, an international consulting company that focuses on helping businesses innovate successfully. Hamel points to Amazon, eBay, and JetBlue. Each company either delivered goods and services differently (by bringing distribution of books or secondhand goods to the Web) or more cheaply (by becoming a sort of Wal-Mart of the skies). Dell has done both. "Dell hasn't done anything to make PCs more attractive, more powerful, or easier to use. To the extent that there is innovation there, it has come from other companies," like Apple, Hamel says. "All of Dell's contributions have been in providing [other companies' technical] innovations to a wider audience at lower cost."
In some cases, innovation that we might think of as technical is actually business-model based. Henry Ford, for example, didn't invent the automobile—but he did develop the production process that drove costs down and enabled him to pay his assembly workers enough that they could afford cars of their own. "You can be tremendous at innovation on the technical side," Hamel says. "But if you can't wrap that innovation into a compelling value proposition, with a dynamic distribution strategy and attractive price points, then the innovation isn't worth much at all."
And it turns out that such value-driven business-model innovation is precisely the sort of thing that Apple is lousy at. Even back in 1989, for example, when the company still commanded a healthy 10% of the global PC market, some internal developers worried that the company couldn't stay competitive without expanding its customer base. And that, they felt, meant bringing down the cost of the Mac, which made its debut in 1984 at $2,500. That's more than $4,300 in today's dollars, which is why the Mac was first marketed to high earners and early adopters of technology. A group of those developers launched an unsanctioned project some called a "gee job," as in "Oh, gee, I'll do that in my spare time," to design a lower-cost Mac for schools.
Moonlighting for about a year, the team found ways to take costs out of the Mac, such as cheapening the floppy drive and using a less expensive, smaller power supply. In the end, they produced a fully functional Mac with a parts cost of about $340. Even with the typical 60%-plus gross margin on Macs at the time, the computer could have retailed for $1,000—far less than the standard Mac. But when the team presented the Mac LC (for low cost) to management, the marketing department nixed it.
"They said things about the computer weren't Mac-like enough, that it made the machine feel cheap," says Owen Rubin, a former Mac software developer who was on the team. Apparently, one sticking point was the floppy drive, which didn't inhale disks the way the original Mac did. Such subtle conventions cost money. Rubin and his team were sent back to the drawing board. The Mac LC hit the market in 1990, at $2,400. Adjusted for inflation, that's more than $3,300 today, meaning that the Mac LC really wasn't low cost after all.
AS BIG RIVALS SWARM, iPOD AND iTUNES MAY HAVE STARTED ONE MORE PARTY THAT APPLE WILL END UP GETTING TOSSED OUT OF.
A Digital-Music Donnybrook?
There's one last essential element to successful innovation that has often been missing at Apple: follow-through. As Howard Anderson, founder of both the consulting firm Yankee Group and the Boston-based venture capital firm Battery Ventures, puts it, "Innovation isn't the key to economic growth. Management is the key to economic growth." In practice, that means supporting product innovation with such things as a solid sales force, a strategy for collaborating with developers and makers of complementary products, and a strategy for customer service. "Companies that rely too heavily on creativity flame out," Anderson says. "In many ways, execution is more important. Apple is innovative, but Dell executes."
Apple's dismissal of such mundane pursuits is another paradoxical by-product of its restless, driven culture of creativity. Things such as sales and service are gritty, not cool; plodding, not imaginative; boring, not sexy. Standing in a darkened hallway just outside the jazz-filled salon of the Musée d'Orsay, technology consultant and Apple fan Anthony Knowles puts his finger on it. "By the time their products hit the market," he says, "they're on to the next thing."
The current focus of Apple's marketing efforts is clear to anyone walking the streets of Paris (or driving up Highway 101 in San Francisco, for that matter). Brightly colored silhouettes of hipsters dancing to their iPods are plastered on bus stops and billboards and flapping against the sides of buildings.
It makes sense that Apple would make so much fuss over the gadget. Since it was first introduced in October 2001, Apple has sold more than 1.5 million iPods, or about 300,000 per quarter today. This means that in two years, Apple has achieved roughly the run rate for the iPod that it took 25 years to achieve with its home PCs.
No one knows the cost to Apple to manufacture and market the iPod, and estimates of its operating margin range widely: 2.5% to 18%. But even at iPod's lowest list price of $299—and using a conservative margin estimate of 8%—it's clear that the iPod contributed substantially all of Apple's 2003 estimated operating income of $24.8 million, excluding onetime charges. Without the iPod, Apple is in trouble.
That's why recent releases of competing portable music players take on great significance. Selling for as little as $299, the Dell DJ is about $100 cheaper than the iPod with the same 5,000 song capacity. (A $500 iPod holds 10,000 songs). A third product, a 20-GB unit made by Samsung to work with Napster 2.0, costs $100 less than the 20-GB iPod, or about $300, and boasts a lot more features, including a built-in FM transmitter—to play songs on a car radio—and a voice recorder.
In terms of its innovative legacy, the iPod and iTunes together probably represent Apple's greatest achievement since the introduction of the Apple II in 1977. First, because they mark an important evolution inside Apple as it moves further away from its roots as a PC company and closer to a new role as a consumer-electronics and entertainment shop. Promoting the Mac as the "hub of a digital lifestyle" certainly indicates recognition that Apple may do better to cut its losses in the PC business. In this arena, Apple may benefit from its consumer focus, artful design, and strong brand equity.
iTunes also deserves recognition as Apple's first foray into business-model innovation. It is, after all, nothing but a novel distribution and pricing arrangement. Apple's ability to get users to pay for songs, rather than steal them, also convinced the recording industry that digital-music delivery was worth supporting. Without this leadership, Roxio Inc.'s Napster 2.0 and Dell/Musicmatch might never have negotiated their own digital-rights agreements.
Still, Apple may have learned these important lessons only partially, and too late. The iPod works only with the iTunes service, and has a $0.99 fee-per-song pricing structure. Dell/Musicmatch and Napster offer consumers more choice. Their Windows-based players and services are interchangeable; they sell individual songs and let users listen to (but not keep) as much music as they want for flat fees of less than $10 per month. Meanwhile, the $15 million or so that iTunes has generated in revenue thus far is statistically meaningless even for Apple. And after it has paid the music labels and covered its costs, Apple is left with just pennies per song. Even using a generous operating margin estimate, iTunes won't turn a meaningful profit until it hits Jobs's stated goal of 100 million songs sold. Jobs has said he hopes to do so by April, but at the current rate of 1.5 million songs sold per week, that is more than a year away.
And the competition is swarming. Dell and Samsung are challenging enough, but this business is about to turn into a battle of the titans. Wal-Mart is launching a cut-price online music store of its own—and now Microsoft and Sony, no less, are joining the fray. So Apple's venture into online music is beginning to look like yet another case of frustration-by-innovation. Once again, Apple has pioneered a market—created a whole new business, even—with a cool, visionary product. And once again, it has drawn copycats with the scale and financial heft to undersell and out-market it. In the end, digital music could turn out to be just one more party that Apple started, but ultimately gets tossed out of.
Sidebar: Tuscan Stone?
Early on a sunny Saturday morning in July, a line of eager people stretches down the block and around the corner. Sleeping bags, camping chairs, and food are scattered evidence of a long night on the sidewalk. These eager fans—groupies, really—aren't faithful followers of the latest cult rock band: They're here for the opening of the newest Apple store, in the sleepy Bay Area town of Burlingame, California.
When the doors finally open at 10 a.m. to the rousing strains of U2's "Beautiful Day," a double line of 35 Apple Store employees clap to the music and high-five the cheering sidewalk sleepers as they pour into the store. Ron Johnson, Apple's vice president of retail, enters a few minutes later with his two children and is greeted like a rock star. It's quite the reception for an opening that's hardly novel: Burlingame marks the 63rd addition to Apple's retail chain—and the fifth in an already crowded Bay Area market. But these folks don't seem to notice.
The store is done in iPod shades of white. "We chose hand-selected Tuscan stone for the floors—a stone that's somewhere between sandstone and limestone," Johnson says. "It's the same stuff Florence was built on." Each store boasts a Genius Bar, where customers can get technical support from Mac experts. Apple products are laid out on broad tables that are grouped by category: photo, music, movies. They're configured with speakers, iPods, and other peripherals so that users can see how an ideal "digital media hub" works. "The real reason we're here is to drive market share, so we devote most of the space in the front of the store to our products and the experience of them," Johnson says. "I'd love to see Apple get back to 15% market share someday."
Well, sure. But two years since the launch of Apple Stores, it's still unclear whether the strategy has moved the needle at all. Apple claims that 50% of all its retail-store buyers are new to Macs (some buying their first computer, others switching from Windows machines), but analysts such as Roger Kay, from the technology market research firm IDC, dismiss any notion of progress. "They're losing as many to Windows as they're gaining."
The real problem, according to Kay, is the enormous investment that Apple must make to open each retail location. Apple is holding leases on some of the most expensive real estate in the country, in places such as tony Michigan Avenue in Chicago and New York's trendy SoHo. And then there are those Tuscan stone floors. "Apple is creating a boutique environment, and they're doing it in a very expensive way," says Kay. "It doesn't seem very reliable as an approach for selling large quantities of goods."
Johnson argues that it's far too soon to claim success or failure on the issue of overall market-share growth—but that the retail stores are a very definite success. "Wal-Mart has taken 40 years to get where it is today," he says. "We've been open for just over two years. As we continue to add more and more stores, you'll see. We will move the needle." -Alison Overholt
Sidebar: Getting Innovation Right
If Apple teaches us anything, it's that effective innovation is about more than building beautiful cool things. A few thoughts for innovating well in your own shop:
- Not All Innovation Is Equal
Technical innovation will earn you lots of adoring fans (think Apple). Business-model innovation will earn you lots of money (think Dell).
- Innovate for Cash, Not Cachet
If your cool new thing doesn't generate enough money to cover costs and make a profit, it isn't innovation. It's art.
- Don't Hoard Your Goodies
Getting to market on time and at the right price is vital. If that means licensing your idea to an outside manufacturer or marketer, do it.
- Innovation Doesn't Generate Growth. Management Does
If you covet awards for creativity, go to Hollywood. Managers get rewarded for results, which come from customers.
- Attention Deficit Has No Place Here
Every innovation worth doing deserves your commitment. Don't leap from one new thing to another. If your creation doesn't appear important to you, it won't be important to anyone else.
A version of this article appeared in the January 2004 issue of Fast Company magazine.