By @nealunger. Source: http://www.fastcompany.com/3032872/ibms-3-billion-investment-in-synthetic-brains-and-quantum-computing

IBM's $3 Billion Investment In Synthetic Brains And Quantum Computing

IBM thinks the future belongs to computers that mimic the human brain and use quantum physics, and it's betting $3 billion on it.

IBM is unveiling a massive $3 billion research and development round on Wednesday, investing in weird, science fiction-like technologies—and, in the process, essentially staking Big Blue’s long-term survival on big data and cognitive computing.

Over the next five years, IBM will invest that money in technologies like non-silicon computer chips, quantum computing research, and computers that mimic the human brain.

The $3 billion funding round will go towards a variety of projects designed to catapult semiconductor manufacturing past what IBM physical sciences director Supratik Guha calls the "end of silicon scaling" in microchips. Essentially, IBM believes there will be a point in the medium-term future where microchips will no longer be made out of silicon because other materials will allow for faster and more complex computation. In a telephone conversation, Guha told Fast Company that his company sees an end to silicon scaling within the next three to four tech generations.

The new R&D initiatives fall into two categories: developing nanotech components for silicon chips for big data and cloud systems, and experimenting with "post-silicon" microchips. This will include research into quantum computers, which don't rely on binary logic; neurosynaptic computers, which mimic the behavior of living brains; carbon nanotubes; graphene tools; and a variety of other technologies.

IBM’s investment is one of the largest for quantum computing to date; the company is one of the biggest researchers in the field, along with a Canadian company named D-Wave which is partnering with Google and NASA to develop quantum computer systems.

The news of the funding round surprised some IBM-watchers; for the past year or so, rumors have flown that IBM was considering an exit from the microchip business. Rather than an about-face, the funding round signals a shift in how IBM plans to make money from chips. While it still looks likely that IBM will sell its chip unit (the New York Times cites GlobalFoundries as a likely buyer), the investment means that IBM sees long-term returns in holding valuable, potentially lucrative patents and intellectual property.

"The point of this announcement is to underscore our commitment to the future of computing," Guha told Fast Company. "As you probably know, silicon technology has taken us a long way. A lot of stuff you see around you is a result of our ability to scale silicon tech, but the community at large realizes the end of silicon scaling is coming. However, performance scaling in computer system will continue in various ways; our R&D efforts are focused on different ways and means by which we do so."

Of all the investments announced in the round, neurosynaptic chips are the most novel. These low-power microchips are designed to mimic the behavior of the human brain, and IBM has been researching the feasibility of such technology for years. IBM is believed to be building a new programming language around the chips, which will be used for machine learning and cognitive computing systems like Watson. Proof-of-concept neurosynaptic computing projects IBM has announced previously include oral thermometers that identify bacteria by their odor and "conversation flowers" placed on tables that automatically identify speakers by voice and generate real-time transcripts of conversations, rendering transcriptionists obsolete.

[Image: Flickr user Sarah]

Comments

  • Brad Arnold

    Abandoning silicon is a no-brainer, and neuromorphic architecture is a promising frontier. I would rate IBM's decision as rational rather than risky. The Singularity is coming, and ASI is not only the holy grail of computing, but an inevitability. Few people comprehend the ramifications of the emergence of ASI.

  • John Sellers

    I think that quantum computers are a fiction that will never happen.

    I believe that as soon as you attempt tasks that conventional computers cannot do, you have reached the point where the conventional statistics needed to extract the answer from the quantum computer are no longer up to the job of separating it from the overwhelming scope of the associated superposition.

  • Not fiction at all. Quantum computers and quantum computing are already here. Entangled systems will be common in every desktop computer. The tasks that we are currently processing with conventional computers were at one time not even thinkable...

  • John Sellers

    They are not here. As far as I know, the record entanglement is 14 qubits. That comes to 2 raised to the 14th power superimposed states, which can be duplicated by a conventional computer with about 16,000 CPUs. Such trivial results are of only mild interest. That is the record entanglement, and no computer has been built that has demonstrated even 14 entangled qubits.

    The real payday would not come until you have something that could factor numbers on the order of 10 raised to the 300th power. To express such a number in a quantum computer would require an entanglement of about 1,000 qubits.

    After you got such a computer, how would you read the answer? The entanglement is holistic, so you can't just read out the qubits; instead you have to analyze the entanglement as an entity. Oops... to read out a 300-digit number, your error correction would have to deliver a precision of one part in 10 raised to the 300th power.

    That is an intimidating task, to say the least.
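    For what it's worth, the arithmetic behind the figures in this comment checks out and can be verified directly; here is a minimal Python sketch (the "16,000 CPUs" comparison is the commenter's own rough analogy, not a benchmark):

```python
import math

# Classical amplitudes needed to describe n fully entangled qubits: 2**n.
n_qubits = 14
print(2 ** n_qubits)  # 16384, roughly the "16,000" figure cited above

# Qubits needed to hold an integer near 10**300:
# ceil(log2(10**300)) = ceil(300 * log2(10))
print(math.ceil(300 * math.log2(10)))  # 997, i.e. about 1,000 qubits
```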

  • David Pease

    They've already built quantum computers. It's years away until they become really useful, but given time it WILL happen (if the economy endures). It's not fiction.