Do you remember the day when Netscape became a $2 billion company, or when Amazon upended our online expectations by introducing 1-Click shopping? How about when Napster faced down Metallica and Twitter showed us what was really happening on the streets of Tehran? These are the moments of innovation that changed our world and continue to point us to the future.
It was the IPO pop heard round the world: How Netscape’s triumph signaled the arrival of a new business model—and a new mind-set.
I first realized that August 9, 1995, would be a day to remember when two coworkers bet each other on the opening and closing price of a new stock issue from a Mountain View, California–based company called Netscape Communications. Their excitement was electric, and it only grew as the offering price doubled from $14 to $28 that morning; when the stock finally began trading, after 11 a.m., it quickly hit $71. It ended the day at $58.25, giving the company a market capitalization of more than $2 billion. Not bad for a 16-month-old startup with just $16 million in lifetime revenues.
That sort of thing simply did not happen in those days. But Netscape’s IPO came to symbolize the profound changes about to take place in business and culture. Netscape represented a bet on the potential of the Internet to transform society. It was the original Internet platform—something virtually every technology startup now aspires to be. And it was the proto-example of twentysomethings moving to California to create a company and seek their fortune, led by a visionary like Marc Andreessen.
But what came to be young Netscape’s defining characteristic—and its most lasting contribution to business—was speed. At a time when software was still sold on disks in shrink-wrapped boxes, Netscape started releasing beta software on the Internet month after month. It would get feedback from customers and pump out two new versions by the time those software boxes even reached stores. Its pace of innovation—and its rush to an IPO—made it the company to emulate. In November 1995, Fast Company profiled Netscape as the ur-example of a company living the new rules of competition, writing, "In an economy where even breakthrough technologies become obsolete within a few years, where even the deadliest competitors must change their game in the face of changing circumstances, Netscape Time may be the company’s most enduring invention."
Today, we expect—and even demand—the constant iteration that goes into improving the software that runs our lives and our work. And speed is the value that determines winners and losers in our hypercompetitive culture. It’s ironic, then, and a little sad, that speed killed Netscape. The company moved so fast that it never settled on a business model and didn’t see the opportunity to transform itself into a portal until it was too late. Many people, even Netscape employees, blame Bill Gates’s anti-competitive practices for the company’s demise. The fact is, by late 1997, Microsoft had built a faster web browser than Netscape. Netscape’s breakthrough technology had been eclipsed, and it didn’t adapt to the new reality. Being first has its advantages, but agility is what’s required to win. —David Lidsky
If anything captured the early promise of the Internet, it was Craigslist. Everything about the classifieds site—its open environment, freewheeling nature, democratized content—embodied the information superhighway. A community hub that evolved into a place where you could look for an apartment, find a job, sell goods, and search for love, Craigslist also looked like the Internet, with its blue links and HTML–gray borders and text boxes. And it still does. What’s most impressive about the site, though, is that its utility is so profound that it has not only survived but receives a whopping 50 billion page views per month. And while Craigslist might’ve contributed to the decline of the traditional-media business, which long relied on local classifieds for revenue, it also gave rise to an endless number of online services, from real estate sites to dating apps. Where some saw simple listings, others saw (and still see) opportunity. Founders of companies such as Zillow, Hinge, and TaskRabbit found inspiration—and much room for improvement—in Craigslist’s stripped-down design and no-questions-asked approach. "Craigslist was one of the first websites to publish real estate listings in real time, and that changed the landscape for home shopping," says Zillow CEO Spencer Rascoff. "But when they didn’t innovate, they set themselves up for disruption." —Austin Carr
In the mid-1990s, a mere 8% of U.S. consumers felt the web was secure enough to make purchases with a credit card. This was before PayPal and iTunes, when it was still common for people to complete eBay transactions by mailing cash or a check. All that changed in 1997, when Amazon introduced a small feature called 1-Click, which enabled customers to securely store their credit-card information online and make all future purchases with a single click of the mouse. Not only did it make shopping exceedingly and addictively easy, it also brought the magic of the Internet to life, helping to usher in an era of increasingly seamless e-commerce. CEO Jeff Bezos, naturally, patented the 1-Click idea, and even Apple felt that the technology was so core to the future of shopping that it licensed it for its online store. (Barnes & Noble, on the other hand, ended up in a lawsuit with Amazon after attempting to skirt the patent with its own Express Lane checkout option.) The feature remains a staple of the Internet and is arguably a precursor to many of today’s mobile apps, including Uber, Instacart, GrubHub, and Starbucks. Amazon, in the meantime, has continued to reduce the friction of shopping, from its introduction of Prime in 2005 to the one-hour delivery service it's currently testing in a handful of U.S. cities. —Austin Carr
Before the first W opened on a quiet stretch of Lexington Avenue in Midtown Manhattan, staying at an American hotel chain usually involved starchy floral comforters and sad, beige carpeting. Though trendy boutique hotels like the Royalton and the Mondrian were already prioritizing design at the time, the W was the first to take a high-end aesthetic and make it available to everyone regardless of status. By the following year, when the young brand opened its 10th outpost, W Hotels were packing in celebrities, socialites, and—most significantly—loyalty-points-hoarding business travelers.
The W was the vision of former Starwood Hotels & Resorts CEO Barry Sternlicht, who, a few years earlier, had realized that while brands such as Pottery Barn and Banana Republic were bringing a new sensibility to the shopping mall, most hotel chains were still peddling an outdated look and feel. So he pulled together a team of architects and designers to try to change that. Their mandate: "To bring good design to people who had never experienced it in a hotel before," says Theresa Fatino, W Hotels’ former VP of brand development and design. The first W offered chic, minimalist residential rooms and dramatic public spaces with gossip-page-worthy restaurants and lobby lounges. It was an intoxicating formula, and W Hotels quickly became a white-hot brand.
Today, the chain includes 47 hotels around the world and has 40-plus more in development. Its formula has been much-copied: Every major hotel group now has a boutique line (including Marriott’s Edition and Hyatt’s Andaz), and even the trendsetting Ace hotels, with their gathering-spot public spaces, owe a debt to Sternlicht’s original vision of inclusive design. With the W, ordinary travelers could finally enjoy a sophisticated design sensibility along with a good night’s sleep. —Amy Farley
People were nowhere near as jaded about email in March 1999 as they’d eventually become. So when a missive claiming to be an "Important Message" began popping up in inboxes around the world, many folks reflexively clicked on it—and then, as instructed, opened the Microsoft Word file attached.
Bad move. The Word doc—seemingly a list of porn sites—was infected with a virus and automatically emailed itself to the first 50 people in the recipient’s address book. It set off a nasty chain reaction that spread to as many as 20% of the planet’s business PCs, according to some estimates.
The virus’s author called it Melissa (after a Miami stripper) and ended up spending 20 months behind bars. The scheme was more annoying than malevolent, but because it was so effectively (and socially) engineered, it was the first cybersecurity attack that many people experienced directly. In the years since, online threats have grown far more widespread and dangerous, with hackers finding ways to steal credit-card details, sensitive corporate data, and even celebrities’ private photos. —Harry McCracken
What is music worth? For decades, the answer had been simple: whatever it cost at your local record store. But with the launch of Napster in 1999, things got confusing. Suddenly there was no reason to hit Tower Records if you wanted the new Metallica single. You could download it—along with pretty much any other song you could think of—without leaving your house . . . and without spending a cent. Music fans went nuts for Napster, and while many artists were also excited, others pushed back. Metallica drummer Lars Ulrich was the most outspoken opponent, and in April 2000, his band filed a lawsuit against Napster. It was the first time an artist sued the file-sharing service, and it helped usher in a profound reorganization of the music industry. Napster cofounder Sean Parker—who was also Facebook’s founding president and today is on the board of Spotify—reflects on his company’s revolutionary impact. —Rob Brunner
Fast Company: Why did Napster connect in such a big way?
Sean Parker: The obvious part is it was about music. It was the first time ever that the entire catalog of recorded music was available in one place. And it was also the first product that was truly a pure expression of what the Internet could do. Nothing up to that point had captured the possibility of hundreds of millions of people coming together and sharing media.
How did you react when you heard about the Metallica suit? It must have been terrifying.
What’s amazing is that nobody was ever terrified. There was just a sense of disbelief. Suddenly you’re thrown into this dreamlike world where mass-media icons and celebrities are filing litigations against you. You have an understanding that Napster is just symbolic. It’s not that they’re after you, as much as they’re after this cultural shift that they’re about to be left out of. Napster was deeply offensive to the cultural norms that existed at that time and to the power structure of the recorded-music business that existed relatively unchanged for [many] years.
You could almost talk about it in fantastical terms. We had created this as some sort of mad-scientist experiment. We unleashed some kind of suppressed force in the universe. [It was as though] you’re just an observer of this sort of spirit force that’s going to war with some existing system. You have a front-row seat to watch this epic showdown, but you’re not functionally in control of any of it. In hindsight, you realize that Napster has a huge level of importance in terms of this larger battle between the top-down centralized mass media and the bottom-up decentralized social media we were moving into. Napster is that critical turning point: the first battle that is fought in this war for the hearts and minds of the media business itself.
Could you sort of understand where Metallica was coming from?
Of course. We really cared about music and were huge fans. So anything short of growing the industry and making sure artists got paid was not an acceptable solution. But the real battle wasn’t between whether artists should get paid or they shouldn’t. It was between the pre-Internet world of media—which is controlled by gatekeepers, completely centralized—and the post-Internet world of media that was social and decentralized. Napster became the first point where artists were forced to confront the reality that as this Internet thing began to catch on, the degree of control they were able to exercise over their content was going to change. Now Lars and I have become really good friends—so much so that he came to my wedding. We hang out in San Francisco all the time and reminisce.
How does the country’s largest natural-foods grocer prove it’s not just for health fanatics? Win over the picky palates of New Yorkers: In February 2001, Whole Foods’ Manhattan flagship opened in the Chelsea neighborhood, and its aisles were soon flooded with stylish urbanites. That year, the USDA created broad standards to regulate the influx of organic producers. "It was an affirmation that organic is real," says Margaret Wittenberg, global VP of quality standards at Whole Foods, which now has nine stores in New York City (and 400-plus nationwide). Today, 84% of Americans buy organic food at least occasionally, and Whole Foods continues to push the industry in new directions. It compelled competitors to make organic more affordable with the launch of its 365 Organic Everyday Value line of products in 2002, and brought prominence to sustainable seafood by requiring vendors in the U.S. and Canada to undergo audits of their products beginning in 2008. The grocery chain has also helped to grow the organic ecosystem by offering low-interest loans to small-scale producers. Next up: Whole Foods is bringing transparency to food labeling with its vow to clearly label all genetically modified products sold throughout the U.S. and Canada by 2018. —Jessica Hullinger
Richard Florida was a little-known urban studies professor at Pittsburgh’s Carnegie Mellon when he wrote a book noting a new dynamic playing out in American cities, one in which economic growth was being driven by a burgeoning force of creative workers. The Rise of the Creative Class became a best seller and, following the dotcom bubble, a handbook for cities hoping to engender their own creative resurrections by attracting the same kind of talent as Austin and San Francisco—from engineers and architects to poets and musicians. Florida’s book also resonated with young people who saw themselves in the creatives he described: individualistic, meritocratic, and open to new people and ideas. Entrepreneurship and innovation, rather than traditional financial success, were their prime motivators. As this generation of self-conscious creators pursued their passions, they helped refashion cities in their image. High-end coffee purveyors, artisanal boutiques, and design-driven coworking spaces spread from Brooklyn to Oakland (and spawned countless parodies). Today, whether aware of Florida’s book or not, the creative class continues its urban alterations, most recently in his former backyard. In The Rise of the Creative Class, Florida criticized Pittsburgh for its failure to retain creative people. These days, the city is full of them. —Amy Farley
When the Human Genome Project finished its DNA map, it had big implications for science, medicine, and business.
It was one of the most ambitious projects ever undertaken by the scientific community: a multibillion-dollar, 13-year effort to map the 3 billion DNA letters in the human genome. On April 14, 2003, an international team dubbed the Human Genome Project announced that it had completed its mission—a milestone that helped spur major new directions in research, launch thousands of businesses, and create hundreds of treatments and diagnostic tools. Among the breakthroughs that have resulted from the Human Genome Project: therapies targeting certain forms of blindness and cystic fibrosis; genetic tests that predict the effectiveness of breast-cancer chemotherapy, drug cocktails for hepatitis C, and colon-cancer treatments; and advanced prenatal testing and preconception carrier screening. Thanks to such work, the overall economic impact of the Human Genome Project is an estimated $966 billion. —Adam Bluestein
Is there anything that Chinese conglomerate Alibaba Group doesn’t sell? For your next beach vacation, you could book flights on Alibaba-owned Alitrip, buy sunscreen on Alibaba’s Tmall, and archive your sun-kissed selfies on Kanbox, Alibaba’s cloud-storage platform. And with Alipay, a payments platform similar to PayPal, you could do it all via digital wallet.
When Alibaba introduced Alipay in 2004, it was a signal that the then-five-year-old company’s growing ambitions extended beyond pure retail. Still, it would have been hard to imagine that a decade later Alibaba would achieve the largest IPO in history, a $25 billion debut on the New York Stock Exchange. At the time of Alipay’s launch, only 7% of shoppers in China had access to the Internet, and the e-commerce market was worth just $750 million.
Then, revolution arrived. Alibaba founder Jack Ma anticipated the value that smartphones would unleash when paired with the consumer demand of China’s fast-growing middle class. His strategy: to become the one-stop shop for mobile consumers, with Alipay as the central infrastructure. Retail, messaging, transportation—if there’s an app for it, there’s an Alibaba solution on hand. Now the annual transaction volume on Alibaba’s primary e-commerce platforms is more than double what consumers spend on Amazon.
But there are challenges ahead. In China, competitors like the Tencent-backed JD.com are gaining share of the e-commerce market. At the same time, signs of slowing economic growth sparked a massive downturn in Chinese equities this past summer. The uncertain outlook at home has put pressure on Alibaba to prove that it can succeed in other countries. The company is well positioned for emerging markets, thanks to its expertise in operating in places where logistics are poor. But finding a way to grow in the U.S., a mature market where quality is paramount, is a different story. In the meantime, some Silicon Valley brands are laying the groundwork to compete with Alibaba on its home turf. Apple has filed paperwork in Shanghai to launch Apple Pay, a direct Alipay competitor. —Ainsley O’Connell
It used to be that venture capitalists expected a certain amount of control in return for their check, and young entrepreneurs often found themselves on the sidelines of their own companies. But with the success of formidable founders such as Mark Zuckerberg, who in 2005 raised $12.7 million from Accel Partners without ceding control of Facebook, the power balance began to shift. That same year, Silicon Valley incubator Y Combinator launched with the then radical notion that founders should be cultivated, not marginalized. Instead of just pumping money into companies, YC created a now-biannual program that selects promising founders, guides them through a rigorous grooming program, then helps them pitch their companies to other investors. Current president Sam Altman—whose company, Loopt, was part of the first-ever YC class—and founding partner Jessica Livingston recall how that idea transformed modern venture capital. —J.J. McCorvey
Jessica Livingston: At the time, there wasn’t anything available for founders who were looking for seed-stage capital. It was either talk to your rich uncle or you had to have a whole business plan mapped out. [YC cofounder] Paul [Graham] and I thought, We should be the first gear for startups that just need a little money so [founders] can quit their jobs and focus on working on an idea. We called ourselves Y Combinator, put up a website, and launched the Summer Founders Program. We chose eight startups. In that batch were Alexis [Ohanian] and Steve [Huffman] of Reddit, and Justin [Kan] and Emmett [Shear] of what would later be known as Twitch.
Sam Altman: I was at Stanford when YC was announced, and it spread like wildfire through the computer science department. All of us were like, finally, this is made exactly for us: This is the amount of money we need, and these people sound like they are really cool to work with, unlike venture capitalists, who seem scary.
JL: We didn’t want to be a traditional VC. We wanted to be really founder friendly. All of the funding was for common stock. No preferred stock, no extra bells and whistles. Back then, [entrepreneurs] had to get a lawyer and spend, like, $20,000 just to incorporate. Instead, we’d do all the legal paperwork for them and teach them about the business side of things, because we were specifically targeting computer science students and programmers.
SA: With VCs, there were all these terms that were deliberately obscuring what was going on, and I think they liked it that way. Liquidation preferences and ratchets and anti-dilutions and drag-alongs and no-shops. It was this club that was hard to break into. They would only talk to you if someone they knew introduced you. YC was the first that said very clearly, Here is how we work, here is how you approach us, here’s how we make our decisions. We don’t need to buy half your company.
JL: We wanted to make quick decisions, because venture capitalists take a long time to decide to invest in you—months, in some cases. We told applicants "yes" or "no" the same day we interviewed them. That was never done before us.
SA: There’s a big category of stuff that is not as important as building a product, but if you don’t know it, you can’t be successful: how to raise money from investors, how to hire and manage people. We left YC knowing all of that. It takes a long time to learn to be a great coder; it does not take a long time to learn all of that other stuff. What happened in 2005 was a shift from investors holding most of the leverage to founders holding most of the leverage. That was a tectonic change.
JL: There’s also just a more diverse group of founders. The biggest lesson was that more people will start startups if we make it easy for them.
It was the year of Al Gore’s inconvenient truth and Elizabeth Kolbert’s Field Notes From a Catastrophe, and public opinion around climate change seemed to be coalescing. So why weren’t more Americans buying energy-saving compact fluorescent lightbulbs, or CFLs? In essence, consumers didn’t understand why CFLs cost so much more than incandescents and were dubious of their light quality. As a result, nearly three decades after being introduced in the U.S., CFLs were used in only about 6% of homes. How Walmart nearly single-handedly changed that is a study in how a company can use its size for good.
In November 2006, Walmart’s then–CEO, H. Lee Scott Jr., announced an ambitious initiative: The company would sell 100 million CFLs by the end of the following year. When put into use, these bulbs would save consumers $3 billion in electricity costs and cut U.S. power needs by the equivalent of 450,000 homes. Lighting manufacturers balked, and Walmart’s critics leveled charges of greenwashing. But the retailer went all-in, redesigning displays, increasing selection, educating consumers, and launching an affordable private-label brand. The campaign worked and Walmart blew past its goal: In 2007, it sold 162 million CFLs. The company’s massive purchase of CFLs allowed it to dictate specs to its suppliers: It required Energy Star–rated bulbs that met strict performance standards and got manufacturers to reduce the amount of mercury in CFLs. Walmart’s actions also helped secure passage of the Energy Independence and Security Act of 2007, which ordered the gradual phase-out of inefficient incandescent bulbs by 2015.
In the second quarter of 2014, shipments of CFLs surpassed traditional incandescents for the first time. But most experts believe the future belongs to LED bulbs, which are even more efficient than CFLs—and increasingly affordable. Today, a CFL, LED, or next-generation incandescent fills one in three U.S. light sockets, helping to keep millions of tons of CO2 from entering the atmosphere. —Adam Bluestein
Remember life before the iPhone? We used to watch movies on our living-room TVs, hail taxis by raising an arm, and lovingly crumple photos of our kids into leather carriers we called "wallets." In the eight years since Steve Jobs introduced those ubiquitous little devices in June 2007, they have altered—and, yes, improved—our daily routines in countless ways. But the smartphone’s impact has been far greater than just changing the UX of our lives. It’s also affected our digital infrastructure, our design sensibilities, and even the way we think about modern labor practices. Here are some of the biggest transformations brought on by the iPhone. —Mark Wilson
1. THE SOFTWARE REVOLUTION A year after launch, the iPhone introduced the catchy word app and the idea of software that, with a couple of button presses, installs itself and appears as a simple icon. It changed not just how we talk about software but how we consume and use it.
2. TECHNOLOGICAL MINIATURIZATION Computers used to be boxy machines that tethered you to a desk or, at best, coffee-shop Wi-Fi. But the iPhone and other smartphones have allowed us to be constantly connected. As a result, Americans now spend more time on the Internet via mobile devices than desktops and laptops.
3. A SEAMLESS INTERFACE Thanks to the iPhone, touch screens have become so familiar that children walk up to TVs and are baffled when they can’t swipe. And landmark Apple interactions such as pinch-to-zoom have removed the barriers of mice and keyboards to make information tangible and easily accessible.
4. IMAGE-CONSCIOUS CONSUMERS By including a camera, the iPhone ensured that people would be able to take photographs at any time, wherever they went, in virtually unlimited quantity. It also gave rise to photo-based social apps like Instagram and Snapchat. As a result, photography has become central to the way we communicate with each other.
5. PRODUCTS AS ACCESSORIES More than even Apple’s beloved Macintosh, the iPhone has shown how important it is to connect elegant industrial design with beautiful and intuitive interfaces. Words like simple, magical, and delightful now rule product design. And our devices are as much fashion statements as they are digital tools.
6. THE NEED FOR SPEED In order to keep up with the increased demand for data following the iPhone’s initial release, AT&T—the product’s original exclusive carrier—introduced nationwide 3G service, at a cost of $20 billion. Today, every major carrier in the U.S. supports 4G speeds fast enough to rival broadband.
7. OUTSOURCING'S DARK SIDE Few people thought much about where their devices came from prior to reports of labor violations at factories run by Apple’s Chinese manufacturing partner, Foxconn. Apple has taken steps to improve the situation, but the revelations of unsavory factory conditions were a reminder of the unintended costs of building popular products.
8. A NEW POWER STRUCTURE Remember when Microsoft ruled the world? What about BlackBerry or Nokia? Everything shifted after the iPhone, which sent other phone manufacturers scrambling. Samsung and Xiaomi have built multi-billion-dollar businesses drafting off of Apple. Others were too slow to react and have paid the price.
9. DRAMATIC ENTRANCES With the iPhone unveiling in 2007, Steve Jobs perfected the art of turning new-product introductions into theatrical, buzz-worthy productions. iPhone reveals are now closely watched and tightly choreographed moments—more cultural events than device announcements.
Economic downturns can be surprisingly fertile soil. Apple, Disney, and General Motors were all born from recessions—evidence that when the economy is down, some of the most resilient ideas rise up. The same was true in 2008. Businesses closed doors and layoffs came in waves. But even as Wall Street crashed, green shoots were appearing in Silicon Valley and elsewhere, thanks to entrepreneurs who saw opportunities for innovation in the downfall. It’s no coincidence that the crowdfunding movement emerged that year, marked by Indiegogo’s debut at the Sundance Film Festival. The sharing economy also took root, with Brian Chesky, Joe Gebbia, and Nathan Blecharczyk launching Airbedandbreakfast.com to accommodate attendees at the Democratic National Convention in Denver that August. And as the year shuddered to a close, future Uber founders Garrett Camp and Travis Kalanick began brainstorming ways to fix San Francisco’s cab problem. —Jessica Hullinger
Neda Agha-Soltan, only 26 years old, was shot dead on June 20, 2009. The philosophy student was on her way to watch the protests against the victory of Mahmoud Ahmadinejad in the Iranian presidential election when she was struck by a single bullet, believed to have been fired by security forces.
She died surrounded by onlookers, who held her as she bled—an act that was captured by a shaking cell-phone camera and uploaded to Facebook and YouTube. The video quickly went viral on Twitter, and soon media outlets like the BBC and CNN began broadcasting versions, turning Agha-Soltan into the face of the Green Movement in Iran and drawing attention to clashes between the Iranian people and their government. Never before had citizens in crisis been able to disseminate their stories so easily to millions of strangers around the world.
Aided by hashtags such as #iranelection and #Neda, Twitter played an especially outsize role throughout the summer. As the Iranian government shut down text and cell-phone service inside the country, protesters used the still-nascent social network to communicate vital information—police activity, protest locations, and calls for medical assistance—to one another and to activists outside the country. When the service was blocked inside Iran, protesters turned to websites that helped post tweets on their behalf. And international news outlets combed through Twitter to gather eyewitness accounts of the turmoil. By the end of the year, the proportion of Twitter users outside the U.S. had increased from 38% to 49%.
Agha-Soltan’s death set the stage for an era of citizen reporting and social media activism, which exploded during the Arab Spring less than two years later. Today, it’s standard practice for activists from Hong Kong to Baltimore to capture and share images of every protest sign, arrest, and canister of tear gas they encounter. Along with every bullet. —Anjali Mullany
The basement of a Hyatt in L.A. might not seem glamorous, but for the YouTube-video makers who gathered there for the first-ever VidCon—held over the course of three days in July 2010—it was the most exciting spot on the planet. Organized by scene-leading VlogBrothers creators John and Hank Green, the first VidCon brought together the medium’s most passionate participants. Five years later, VidCon is a 21,000-person event that attracts major brands, and creators such as Grace Helbig have become mainstream stars. John (also known for writing novels such as The Fault in Our Stars and Paper Towns) and Hank reflect on the event. —Nicole LaPorte
Fast Company: What was the vibe like that first year?
Hank: It was always a big deal to me, even if it only had 1,800 people. It was across the street from [top entertainment-business talent agency] CAA, and CAA had no idea that it was happening. But to me it felt like a really important cultural thing.
Now Hollywood studios screen movies at VidCon and brands feel they need to be there. Does part of you want to protect that original spirit?
Hank: Our goal is to reflect what online video is doing. We’re not going to impose our values and say, "No! Online video should be grassroots and independent!" I like that stuff, and in a lot of ways online video works best in an independent and authentic way, but it also works well in other ways.
John, you’re a big-deal novelist. Do you inhabit both the mainstream and underground?
John: The things that I want to do with my life are making videos and writing books. I feel really, really lucky that I get to do both. More people watched my video about why the American health care system is [so] expensive than saw Paper Towns. I guess health care analytics videos are less underground than Hollywood movies.
How has the maturation of the platform changed things?
Hank: Early on, no one knew what the heck was going on, which was kind of exciting but also kind of paralyzing. Now we’ve gone further down the path of what’s possible. You can make a movie or write a book or start a video-game company or do a clothing line. Whatever you’re interested in, you have this activation energy of these really lovely audiences. You can go in infinite directions.
When Google cofounder Larry Page returned to the company as CEO in 2011—after relinquishing the role to tech-industry veteran Eric Schmidt a decade earlier—he focused the sprawling organization on launching projects Google calls moonshots: efforts that strive to change the world by being 10 times better than anything that has come before. The company operates two research arms devoted to such efforts, Google X (which Page helped form) and Advanced Technology and Projects. Today, Google is reengineering the entire company around these otherworldly endeavors, turning core products, such as its search engine, into merely a division of a corporate parent called Alphabet Inc. —Harry McCracken
Businesses focused on social good have been around for decades, but the concept really took off in 2012, as new laws in several states—including startup-friendly California and New York—allowed mission-based companies to reincorporate as benefit corporations and protect themselves from shareholders who might not prioritize values such as employee wellness or eco-friendliness. One of the first companies to sign up: the Ventura, California–based outdoor clothier Patagonia, which gives 1% of sales to green organizations, closely monitors conditions in partner factories, and uses eco-friendly materials in 75% of its products. When it registered as California’s first benefit corporation in January 2012, it helped accelerate the crusade across the U.S. Another big proponent of the regulation was B Lab, a not-for-profit that offers its own nonlegal framework for these kinds of companies, certifying them as B Corporations. According to cofounder Jay Coen Gilbert, the same unrest that fueled the Occupy movement of 2011 helped propel social enterprises into the mainstream. "It was clear that the interests of business and society were no longer aligned," he says. Today, there are more than 3,000 benefit corporations and more than 1,400 Certified B Corporations, including Warby Parker, Kickstarter, and Etsy. —Kenrya Rankin
The most shocking thing about the 2013 launch of Netflix’s House of Cards wasn’t Frank Underwood’s shady political dealings—it was the way the show shattered the traditional television model. "Netflix was the first to recognize and exploit evolving consumption behaviors," says Liam Boluk, a media-strategy consultant at Redef. Here are four ways the Kevin Spacey drama has forged a new approach to TV. —Nicole LaPorte
TV production typically involves commissioning a test pilot episode before committing. Executives have long groaned about the inefficiency—and cost—of making dozens of pilots, only to see a fraction wind up on TV. Netflix paid a whopping $100 million up front for two seasons of House of Cards—before seeing a minute of footage. Six Emmys and dozens of nominations later, it’s safe to say the gamble worked.
Cards was the brainchild of director David Fincher (Gone Girl), and Netflix approached it like a film, giving the creators unusual control. This artist-first mentality was further enabled by Netflix’s interest in quality and buzz over ratings. "Art is best handled by the artist," says Netflix’s chief content officer, Ted Sarandos. "Let them realize their vision. Win, lose, or draw, it happens on the merit of the programming, not on the perceived instincts of a TV executive."
Netflix launched all 13 first-season episodes at once, and fans gobbled them up in marathon sessions. The ensuing reaction turned the show into a phenomenon overnight. The strategy also allowed for a richer narrative, since story arcs no longer had to be neatly resolved at the end of each installment. "This was crafted to be watched in multiple episodes or possibly all at once," says Sarandos. "It had to hold up to a different level of continuity and logic scrutiny, which is one of the things that makes the show so strong."
Netflix pays careful attention to viewer data. The company has created algorithms to help decide whether to green-light shows (Spacey and Fincher were both popular on the service). Big data even informed the way Netflix promoted Cards. Subscribers received different trailers based on their viewing habits: Political junkies saw more of Spacey, while women got more of his wife (Robin Wright). Since then, Netflix has used its data expertise to create and promote other series, such as Orange Is the New Black.
The "move fast and break things" philosophy of innovation gets messy when one of the things you’re breaking is the law. In 2014, some of the most promising companies hit their first major roadblocks with regulation, among them Airbnb and Uber, each of which went head-to-head with city leaders. How they approached and resolved their regulatory battles spoke volumes—and still does, as their legal scuffles continue. —Sarah Kessler
Facing lawsuits and bans from South Korea and France for flouting transportation rules, Uber doubled down on a destination closer to home: Portland, Oregon. Many of the towns surrounding the city had already embraced the ride-hailing app, but Portland Mayor Charlie Hales had been hesitant to work out an agreement with the service. In July 2014, Uber posted a taunting blog entry titled "Hey Portland, We Are Just Across the River." Five months later, without the mayor’s blessing—or knowledge—Uber launched its Portland service and even threw a party, which included a spot to take photos with protest signs and create postcards to mail to Hales. Portland sued Uber shortly after, with Hales asserting that "no one is above the law." Despite Hales’s resistance, Uber effectively forced Portland’s hand. The two sides struck a deal later in December that put the lawsuit on hold and assembled a task force to figure out a new set of rules. The task force eventually suggested a trial period for Uber, which the city has since extended.
Airbnb’s passive-aggressive skirting of San Francisco’s laws banning short-term rentals and requiring a 14% tax on hotel rooms finally caught up to the six-year-old startup. David Chiu, president of the city’s Board of Supervisors, announced that San Francisco had about 100,000 short-term-rental violations a year—and the city had begun cracking down. With the City Planning Department threatening fines for illegal rentals and landlords evicting hosts for violating leases, Airbnb advised its embattled hosts to find lawyers. Behind the scenes, it petitioned Chiu to craft a more accommodating law and agreed to pay outstanding hotel taxes, estimated at $12 million annually. In October 2014, the Board of Supervisors passed the "Airbnb law" to allow short-term rentals (with restrictions). Though Airbnb’s San Francisco conflicts aren’t over—it’s spent $8 million fighting an initiative that would limit how long hosts can rent out their homes—the law legalized its business in the city.
There is something at once awe-inspiring and a little overwhelming about the UN’s latest effort to end poverty and improve the lives of the disadvantaged around the planet. Adopted in September, the organization’s Sustainable Development Goals (SDGs) are an exhaustive agenda covering everything from gender equality and education to climate change and hunger. They’re a sequel of sorts to the Millennium Development Goals launched in 2000. That initiative, fueled in part by a booming China, made great strides in some areas (maternal health, malaria prevention) while falling short in others (climate change, gender inequality). So why have more faith in this round? For one thing, the private sector is getting more involved. Kathy Calvin, president and chief executive officer of the United Nations Foundation, talks about the role of innovation and entrepreneurship in meeting these goals. —Darrell Hartman
Fast Company: What makes the SDGs different from their predecessors?
The previous goals were about ending poverty; these are about moving to prosperity. It’s a totally different mind-set. And I really think the SDGs are a turning point because they call on all sectors—the private sector in particular and entrepreneurs in double particular—to come up with solutions. The door is open to them in a way that it never was before.
Where are the solutions coming from?
It’s not just government aid or Silicon Valley. They’re actually being developed in places like Ethiopia. [People are] solving problems at the ground level, like finding a better way to deliver medicine. They’re really getting at where the hurdles have been for reaching scale. We just need to recognize them, get them some partnership, and start moving.
Where does tech come in?
Mobile phones are a great example. This is a tool that’s in people’s hands already. The question is, how do you take something that is in the field and empower it further? You’re using it to deliver reproductive-health messaging for moms, to check on crop pricing, to remind people to take their meds. There are great uses that enable the kind of progress we want to see. Another big problem is that data is old and cold. In gender, for example, we’ve identified 28 areas where data is either out of date, inadequate, or not even asked for. But the private sector, which has known how to use data for a long time, is showing us ways to be creative about it. For example, students are now reporting when teachers are not showing up to schools, which has been a massive problem.
So what’s the relationship between the public and private sectors?
I think one of the hallmarks of this era will be a fluidity between them. The whole notion that companies have the market cornered on smart, efficient operations and nonprofits on caring about a community is old-fashioned. Both have to have both.
A version of this article appeared in the December 2015/January 2016 issue of Fast Company magazine.