Quantum computing is not easy. But researchers at IBM recently announced that they had taken a step toward solving one of its biggest challenges: developing a better way to detect and correct annoying errors. In a blog post, Mark Ritter, who oversees scientists and engineers at IBM’s T.J. Watson Research Center, wrote: “I believe we’re entering what will come to be seen as the golden age of quantum computing research.” His team, he said, is “on the forefront of efforts to create the first true quantum computer.”
First, what that would mean: A quantum computer harnesses the science of the very small—the strange behavior of subatomic particles—to solve problems that are computationally infeasible for a classical computer or simply take too long. How molecules interact at the quantum level, for example, is difficult to study in a laboratory and impossible to simulate on a classical computer but could be simulated on a quantum computer.
“This (quantum simulation) has potential for things like drug discovery, drug design, chemical design, and hopefully applications in the bio-pharma realm,” says Jerry Chow, manager of IBM’s Experimental Quantum Computing group. A quantum computer could also crack the most sophisticated encryption in use today. The NSA has been investing in quantum computing research for this very reason.
Since the 1990s quantum computers have existed in, well, a quantum state, at once a highly theoretical field of physics and mathematics and a concrete engineering challenge progressing in fits and starts. Academic research labs around the world, governments, and companies including Google, Microsoft, and Lockheed Martin have been working on the basic building blocks of a quantum computer for some years. The Canadian company D-Wave claims to have already built one, but many researchers, including those at IBM, are skeptical about how “quantum” it really is.
A classical binary bit is always in one of two states—0 or 1—while a quantum bit, or qubit, can exist in both of its possible states at once, a condition known as a superposition. An operation on qubits thus exploits this quantum weirdness to act on many values in parallel: a two-qubit system performs the operation on 4 values at once, a three-qubit system on 8, and so forth.
Rather than performing each calculation in turn on the current single state of its bits, as a classical computer does, a quantum computer’s sequence of qubits can be in every possible combination of 1s and 0s at once. This allows the computer, in effect, to explore every possible solution at once and to perform certain complex calculations exponentially faster than a classical computer.
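A minimal sketch (my illustration, not IBM’s hardware) of why the state space grows so fast: an n-qubit register is described by 2^n complex amplitudes, one per classical bit pattern, and a single operation touches all of them.

```python
import numpy as np

n = 3
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0  # start in the classical state 000

# A Hadamard gate on every qubit puts the register into an equal
# superposition of all 2**n bit patterns at once.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
op = H
for _ in range(n - 1):
    op = np.kron(op, H)  # combine single-qubit gates into one big operation
state = op @ state

print(len(state))          # 8 amplitudes for 3 qubits
print(np.round(state, 3))  # each amplitude is 1/sqrt(8), about 0.354
```

Doubling the number of qubits squares the number of amplitudes a classical simulation must track, which is exactly why such simulations become infeasible.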
But there’s a catch. One curious feature of a qubit is that measuring it causes it to “collapse” into a single known classical state, 0 or 1 again, and lose its quantum properties. A quantum calculation ends with a measurement of the entire sequence of qubits to yield a solution.
Many quantum algorithms are non-deterministic; they find many different solutions in parallel, only one of which can be measured, so they provide the correct solution with only a certain known probability. Running the calculation several times will increase the chances of finding the correct answer but also may reduce quantum computing’s speed advantage.
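The collapse described above can be sketched numerically (an assumed toy model, not any particular machine): measurement samples one outcome with probability given by the squared amplitude, so a single run reveals only one answer, and repetition is what builds confidence.

```python
import numpy as np

rng = np.random.default_rng(0)

# A single qubit in the superposition (0 + 1)/sqrt(2):
state = np.array([1.0, 1.0]) / np.sqrt(2)
probs = np.abs(state) ** 2  # measurement probabilities: [0.5, 0.5]

# Each measurement collapses the qubit to a single outcome; only by
# re-running the whole computation many times do the statistics emerge.
outcomes = rng.choice([0, 1], size=10_000, p=probs)
print(probs)            # [0.5 0.5]
print(outcomes.mean())  # close to 0.5 across many repeated runs
```

This is the trade-off the article describes: repetition raises the chance of hitting the correct answer but eats into the speed advantage.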
Most researchers agree that many challenges remain in the quest to build a practical quantum computer. In a paper published in Nature, Chow’s team described its progress in tackling one of those challenges, by designing a way to detect errors on a two-by-two lattice of superconducting quantum bits.
If there are errors in the underlying data stored by any computer then the results of its calculations will be incorrect. Errors rarely occur in the transistors used to build classical computers, and when they do, they are automatically fixed by various error-correction schemes.
Quantum computers are a different story. “Qubits are really susceptible to errors,” says Chow. “They can be affected by heat. They can be affected by noise in the environment. They can be affected by stray electromagnetic couplings.”
Only one type of error can occur in the information stored by a classical computer, a bit-flip, where a 0 is mistakenly flipped to 1 or vice versa. Qubits suffer from bit-flips but also from phase errors. A superposition state of a qubit, having the values 0 and 1 at the same time, is denoted as “0+1”. A phase error flips the sign of the phase relationship between 0 and 1.
“0+1 and 0-1 are very different in terms of the information that’s in that state,” explains Chow. “We have to think of it as an arrow pointing along a sphere. You can point at the south pole and that’s a zero. You can also be pointing at the north pole and that’s a one. You can point along the equator and that’s a 0+1 but if you point to the exact opposite side of that equator, it’s a 0-1.” To make things even more complicated, quantum error correction schemes have to avoid measuring qubit data directly since that will cause the value to collapse.
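The two error types Chow describes can be written out directly (a sketch using standard textbook matrices, not IBM’s circuitry): a bit-flip swaps the 0 and 1 parts of the state, while a phase-flip changes the sign between them, turning “0+1” into “0-1”.

```python
import numpy as np

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])
plus = (ket0 + ket1) / np.sqrt(2)   # "0+1", a point on the sphere's equator
minus = (ket0 - ket1) / np.sqrt(2)  # "0-1", the opposite equatorial point

X = np.array([[0, 1], [1, 0]])    # bit-flip: swaps 0 and 1
Z = np.array([[1, 0], [0, -1]])   # phase-flip: flips the sign of the 1 part

# A bit-flip leaves "0+1" unchanged, since 0 and 1 merely trade places...
print(np.allclose(X @ plus, plus))   # True
# ...but a phase-flip turns "0+1" into the very different state "0-1".
print(np.allclose(Z @ plus, minus))  # True
```

This is why classical error correction is not enough: a scheme that only watches for bit-flips would miss phase errors entirely.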
IBM’s new error-detection scheme is based on a technique called the surface code, which spreads quantum information across many qubits. Two syndrome (or measurement) qubits are coupled with two code, or data, qubits. One syndrome qubit reveals whether a bit-flip error has occurred on either of the code qubits, while the other syndrome qubit flags the case where a phase-flip error occurred, all without directly measuring either of the code qubits.
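A classical analogy for syndrome extraction (a deliberately simplified sketch, not IBM’s actual circuit): a syndrome bit records only the parity of two data bits, so it can flag that something flipped without ever reading out, and thus disturbing, the data values themselves. In the quantum version, a second syndrome qubit performs the same trick in the phase basis.

```python
def bit_flip_syndrome(d0: int, d1: int) -> int:
    """Return the parity of two data bits: 0 if they agree, 1 if not.

    The syndrome reveals only whether an error occurred, never the
    data values themselves."""
    return d0 ^ d1

data = (1, 1)
print(bit_flip_syndrome(*data))       # 0: no error detected
corrupted = (data[0] ^ 1, data[1])    # a bit-flip hits the first data bit
print(bit_flip_syndrome(*corrupted))  # 1: error flagged
```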
But error correction is just one of the obstacles on the rocky road to building a practical quantum computer.
One professor posted a lengthy list of those obstacles on Quora. One of them is the difficulty of maintaining coherence. A common metric for the quality of a qubit is coherence time, or how long it retains its quantum properties. A robust and fully functional quantum computer needs to have a long coherence time. That is still a long way off. In 2014, researchers at the University of New South Wales claimed a new world record when they created two new types of qubits that could retain their quantum state for a full 35 seconds.
“For quantum error correction to work you need the individual qubits to already be above a certain quality,” says Chow. “In order to make these individual qubits better and better, there need to be a lot of developments in terms of materials, how we lay out all these devices and build them into an actual processor.”
Unlike the classical computer, the quantum version has no standard materials or architecture. Qubits are currently constructed in a variety of ways, from ion traps (charged atomic particles) and electrons in silicon to the superconducting circuits used by Chow’s group.
In order to achieve long coherence times, qubits need to be isolated from the external world, typically at temperatures near absolute zero. Isolation makes it difficult to control the computer effectively, since control necessarily involves contact with the outside world. Achieving both control and coherence is expensive.
To achieve those long-lasting qubits, Australian researchers manipulated a single phosphorus atom entombed in a silicon crystal using $100,000 high-frequency oscillating magnetic field generators and a simple electrical pulse to modify the frequency of the atom’s electrons. “Therefore, we can selectively choose which qubit to operate,” explained Andrea Morello, one of the researchers. “It’s a bit like selecting which radio station we tune to, by turning a simple knob. Here, the ‘knob’ is the voltage applied to a small electrode placed above the atom.”
The design has since been updated to control multiple qubits, and last month the lab reported new progress in manufacturing the silicon crystal, by using only a thin layer of specially purified silicon, an advance that could drastically reduce the time and cost of hardware development.
So far, however, nobody has managed to perform more than a few quantum logical operations on a handful of qubits before hitting the coherence time wall. The more qubits are connected, the higher the possibility that a quantum computer will start to behave like a classical one. But more complex problems cannot be solved without a large number of qubits.
“To build a quantum chip that looks like today’s processors,” says Chow, “that’s going to require a lot of engineering and a lot of understanding of different materials and how they behave in the quantum world.”
Another issue is that the set of problems that a quantum algorithm can solve much faster than a classical one may be limited. Since many quantum algorithms are non-deterministic, you need some way to verify whether the measured answer is correct. When calculating the factors of a large number, for example, it’s easy to check the result. But many problems have solutions which are not so easily verified.
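Factoring is the textbook case of an easy-to-check answer: even if the factors come from a probabilistic quantum run, confirming them classically takes a single multiplication. A hedged sketch, with a made-up example number:

```python
N = 21  # the number to factor (a toy example)

# Candidate factors, as if returned by a (hypothetical) quantum run:
p, q = 3, 7

# Verification is trivial even though finding p and q is hard:
print(p * q == N)  # True: accept this run's answer
print(5 * 4 == N)  # False: reject the answer and rerun
```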
Even when the solution is verifiable, you may need to run the same calculation several times to reach the correct solution, thereby reducing the speed advantage. Researchers in Vienna are tackling this problem by inserting short intermediate calculations, whose answers are known, into the calculation. This gives the user a measure of the reliability of the machine. Other quantum algorithms exploit a phenomenon called interference to increase the likelihood that a single run will yield the correct answer.
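Interference can be sketched in miniature (an assumed toy example, not a full algorithm): a phase mark sandwiched between two Hadamard gates makes the amplitudes cancel for the wrong answer and reinforce for the right one, so a single measurement yields the correct outcome with certainty.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
Z = np.array([[1, 0], [0, -1]])  # "marks" the answer with a phase flip

state = H @ np.array([1.0, 0.0])  # equal superposition of 0 and 1
state = Z @ state                 # phase information, invisible on its own
state = H @ state                 # interference: amplitudes recombine

probs = np.abs(state) ** 2
print(np.round(probs, 6))  # [0. 1.]: outcome 1 with probability 1
```

Without the final Hadamard, both outcomes would remain equally likely; the interference step is what converts hidden phase information into a reliable answer.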
IBM’s Chow remains sanguine despite the obstacles ahead, particularly about the challenge of creating the holy grail of a “logical qubit,” which is built from physical qubits but does not lose its information because it is error-corrected. “A lot of these problems will be solved in the next few years and that will help us get to where we can demonstrate logical qubit encoding. Then we can take a step toward some true quantum algorithms on top of that logical layer.”
And the way that we all use computers—scientists, cryptographers, data crunchers, Internet searchers—will be just a little bit closer to a quantum leap.