In October, Google claimed the milestone of quantum supremacy using the bizarre physics of quantum mechanics to solve a mathematical task that would be impossible—or at least damn near impossible—with even the biggest traditional supercomputers. (Rival IBM quickly disputed this.) Google achieved that feat on a machine with 54 quantum bits, or qubits, which use subatomic particles to encode two or more values at once.
Intel, meanwhile, won’t even say how many qubits it’s gotten to work on its latest silicon-based quantum chip (although an analyst puts it at 26). However, the chipmaker claims that its brand of technology, based on the same chipmaking processes it uses for traditional computers, will ultimately be able to pack qubits much more densely than the chips used by Google and other rivals such as IBM and Rigetti.
Today Intel is announcing another miniaturization feat—a controller chip about the size of a hand that replaces large external units typically required to control the qubits. To understand how that works and why it’s potentially significant, let’s take a quick look inside a quantum computer.
All of these systems take advantage of uncertainty about the state of a subatomic particle. A photon, for instance, might be polarized in one of two ways, but the very act of measuring the photon will affect its polarization. Because of this uncertainty, the subatomic particles appear to exist in all possible states at once—called superposition—until you actually take a measurement, at which point a single state emerges.
Maintaining a particle in superposition requires chilling it very close to absolute zero and manipulating it with microwave pulses—two factors that explain why quantum computers are so bulky. They have to reside in refrigeration tubes big enough to hold a person, out of which sprout cables connecting to the external microwave generators, giving the whole setup a bit of a steampunk look.
Higher temperature is relative: Intel’s quantum chip operates at or slightly above 1 kelvin—about two degrees Fahrenheit above absolute zero. But that temperature is high enough, says Clarke, that the tiny bit of heat generated by the Horse Ridge controller chip inside the fridge won’t ruin the silicon spin qubits.
Mistakes will be made
Miniaturization is important because a reliable quantum computer will require way more than a few dozen qubits. The superposition state is so delicate and short-lived that qubits often fail. The way to overcome that is redundancy: use many physical qubits in place of a single logical one, so that one or two errors won’t ruin the calculation. To be practical, then, quantum chips may require anywhere from thousands to millions of qubits, depending on whose estimate you go with. That won’t be feasible if every chip trails its own bundle of control cables, says Clarke.
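The redundancy idea can be illustrated with a classical toy model—a simple repetition code with majority voting. (This is a deliberate simplification: real quantum error correction, such as the surface-code schemes the big labs pursue, is far more involved, and the numbers below are illustrative, not Intel's.)

```python
import random

def encode(bit, copies=5):
    """Encode one logical bit as several redundant physical copies."""
    return [bit] * copies

def apply_noise(bits, error_rate=0.1):
    """Flip each physical bit independently with some probability."""
    return [b ^ (random.random() < error_rate) for b in bits]

def decode(bits):
    """Recover the logical bit by majority vote."""
    return int(sum(bits) > len(bits) / 2)

random.seed(0)
trials = 10_000
# A lone bit fails at the raw error rate (~10% here).
bare_errors = sum(apply_noise([1])[0] != 1 for _ in range(trials))
# Five-fold redundancy fails only when 3+ copies flip (~1% here).
encoded_errors = sum(decode(apply_noise(encode(1))) != 1 for _ in range(trials))
print(bare_errors / trials)
print(encoded_errors / trials)
```

The catch, as the paragraph above notes, is the overhead: every logical qubit costs several physical ones, which is why estimates for a useful machine run to thousands or millions of qubits.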
It’s a good idea to maintain some healthy uncertainty around Intel’s claims. For one, it’s unsurprising for a company that’s trailing competitors to proclaim that others are doing it wrong. And for all of Intel’s decades of dominance in the market for chips for personal computers and servers, it’s had trouble transferring that success to other areas. Intel’s mobile processors failed to gain a foothold in the smartphone and tablet industry, for instance, and it recently abandoned its work on 5G smartphone modems.
But experts and even some of the biggest boosters of quantum tech say there’s a long way to go—years or possibly decades—from successful demonstrations of quantum computing’s potential to an era in which it powers reliable everyday business applications. For now, in other words, various opposing narratives about the future of the technology may exist at the same time.