Facebook didn’t spend eighteen months and tens of millions of dollars developing more powerful–and more energy-efficient–data centers and servers so that it could go into the hardware business. Rather, it hopes its innovations will spur its hardware suppliers, who normally design this gear themselves, to take the blueprints and expand on them to produce the new components and systems Facebook needs to grow ever larger.
That’s why yesterday the company published the fruits of its labors–the blueprints and specs of the new super-efficient system it’s developed–out in the open, for all to use as they please. It’s part of what Facebook is calling the Open Compute Project, which is taking a page from the open-source movement in software.
“We want server design and data-center design to be something people can jointly collaborate on,” CEO Mark Zuckerberg said at the unveiling Thursday. “By sharing this we think it’s going to make it more efficient for this ecosystem to grow.”
Whether the Open Compute Project will work like open source does in software and produce the innovations Facebook is hoping for depends on whether three underlying assumptions pan out. Specifically, the project assumes that if you make information available about how to do something better, companies will put that information to use. It also assumes that they won’t stop there, but will go further and try to build on top of what you’ve created. And finally it assumes that they will turn around and share those innovations back with others in the industry–including, of course, potential competitors.
Many at the Facebook event Thursday seemed convinced of the first part, if only because it means companies will be able to take advantage of the massive cost savings Facebook has discovered. The company’s revamped data center, complete with redesigned servers, is 38% more energy efficient and 24% more cost-effective to build than the industry’s current state of the art.
On the other hand, the innovations Facebook came up with are optimized precisely for Facebook’s needs. Servers are like cars: they’re not necessarily one-size-fits-all. They’re configured to serve the specific computing needs of specific categories of users. Facebook’s servers will work great for Facebook, but not necessarily for the hundreds of thousands of other companies that need computing power.
Case in point: Zynga CTO of infrastructure engineering Allan Leinwand, who sat on a panel at the Facebook event and whose own company is collaborating on the next phase of hardware improvements, resisted committing definitively to adopting the system unveiled Thursday, despite being pressed several times by panel host Om Malik. Leinwand would only go so far as to say, “There is a great leap forward here,” and “We are looking at using it in our own data centers,” with the emphasis on the “looking.”
The key question, though, is whether the data center and server companies that do incorporate parts of Facebook’s innovations will actually try to take them further. The industry has traditionally been conservative about investigating new approaches. And despite the growing emergence of companies like Facebook with new sets of needs, like massive clouds of integrated servers that can do huge amounts of real-time computing, many providers still cater primarily to their traditional customer base: small and medium-sized businesses that want to manage a few servers on their own.
When Google faced a similar problem–how to stimulate innovation in a hardware industry critical to its long-term success–it took a different approach. It dove into the industry and acted like a competitor. In Fast Company’s cover article this month on how new CEO Larry Page will define the company’s future, Farhad Manjoo writes about how Google entered the smartphone business not so much because it wanted to start manufacturing handsets but rather to “wake up drowsy competitors.” Likewise, with its plan to roll out superfast Internet service to several American cities. “Google really wants Verizon and others to pick up the pace,” Manjoo writes. “And when those rivals do, Google will benefit from the innovations that result.”
Certainly, curious and creative minds in the data center and server industries will tinker with what Facebook unveiled Thursday. But whether whole companies will get cracking on making further breakthroughs–in the absence of a can-do-no-wrong 800-pound gorilla like Facebook actually threatening to stomp into their market–is less clear.
The last question is the one that is most in doubt. The data center and server industries have been notoriously protective of their intellectual property, so much so that Facebook vice president of technical operations Jonathan Heiliger nicknamed it “Fight Club,” in a nod to the famous quote from the 1999 movie: “The first rule of Fight Club is you do not talk about Fight Club.”
If Facebook is hoping the hardware business will be inspired by software’s open source movement, it might be disappointed. In software, a single creative individual needs little more than a computer, an Internet connection, and the exercise of their own brain cells to make a contribution.
Hardware, however, is more complicated and requires greater resources. That’s why you rarely see single individuals building new pieces of hardware on their own. It usually requires the combined resources of a company to produce meaningful results.
Facebook’s own project, for example, took “tens of millions of dollars,” director of hardware design Frank Frankovsky tells Fast Company. Among other things, the company built a hardware lab in the shipping bay of its Palo Alto headquarters, complete with a wind tunnel and ovens to test the servers it was building. And that’s not counting the investment that went into researching innovative ways to cool a data center, the results of which were incorporated into Facebook’s new, homegrown data center in Prineville, Ore.
So while other companies might be happy to grab some of the specs Facebook released Thursday for their own products and services, and while they might even be motivated to innovate on top of them, whether they will happily turn over the fruits of their millions of dollars of investment and their months, if not years, of research, is questionable.
From Facebook’s point of view, however, this last point is probably the least urgent. The company would, of course, prefer that players in this space share their discoveries, and it certainly believes that the innovation it needs will come about faster if the industry collaborates. But at the end of the day, Facebook simply wants the innovations to happen. It needs faster and cheaper data processing to bring to life the myriad ideas sitting on its drawing boards. As long as some kind of innovation comes about, proprietary or otherwise, Facebook will be content. And that it will probably get, in one form or another.