Google just made a leap for quantum computing
In a paper published Wednesday in the journal Nature, a team of researchers from Google and several academic institutions describes a method it calls "quantum annealing with a digital twist." Essentially, the team combined the quantum annealing approach with the "gate" model of quantum computing and found it could get the best of both worlds.
Quantum computing is widely anticipated for the giant gains it's expected to deliver in performance and efficiency. Much of that promise derives from what's known as superposition. While the bits used by traditional computers represent data as 0s or 1s, superposition allows qubits -- the quantum equivalent of bits -- to be both 0 and 1 at once.
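For a sense of what that means mathematically, here's a minimal sketch in Python with numpy; the state vectors and names are purely illustrative, not anything from the paper. A qubit's state is a pair of complex amplitudes, and a superposition assigns weight to both 0 and 1:

```python
import numpy as np

# A qubit's state is a vector of two complex amplitudes (alpha, beta)
# with |alpha|^2 + |beta|^2 = 1. Squared magnitudes give the odds of
# measuring 0 or 1, so an equal superposition yields a 50/50 split.
ket0 = np.array([1, 0], dtype=complex)    # the classical bit "0"
ket1 = np.array([0, 1], dtype=complex)    # the classical bit "1"
plus = (ket0 + ket1) / np.sqrt(2)         # superposition: both at once

print(np.abs(plus) ** 2)                  # [0.5 0.5]
```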
IBM is one of the best-known companies associated with quantum computing today, not least because it announced a few weeks ago a five-qubit quantum processor that it has developed and plans to make available via the cloud. To create that technology, IBM used the gate model, in which qubits are linked together to form circuits. One key advantage of that approach is its compatibility with error correction.
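As a rough illustration of what the gate model looks like (a toy simulation, not IBM's hardware or software), the canonical two-gate circuit below applies a Hadamard gate and then a CNOT to two qubits, entangling them:

```python
import numpy as np

# Gate-model computation is a circuit: qubits start in |00>, and gates
# (unitary matrices) act in sequence. A Hadamard on the first qubit
# followed by a CNOT entangles the pair -- the canonical two-gate circuit.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

state = np.zeros(4, dtype=complex)
state[0] = 1.0                       # start in |00>
state = np.kron(H, I) @ state        # Hadamard on qubit 0
state = CNOT @ state                 # qubit 0 controls a flip of qubit 1
print(np.abs(state) ** 2)            # [0.5 0. 0. 0.5] -- a Bell state
```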
A competing model, used by quantum specialist D-Wave, relies on quantum annealing. Also known as the adiabatic approach, this method focuses on finding and maintaining the lowest energy state of a gradually evolving quantum system.
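To make the adiabatic idea concrete, here's a toy numpy sketch (an illustration of the principle, not D-Wave's machine). The system's energy landscape is swept from an easy "driver" whose lowest-energy state is simple to prepare toward a "problem" Hamiltonian whose lowest-energy state encodes the answer:

```python
import numpy as np

# Toy sketch of the adiabatic principle (not D-Wave's hardware): sweep
# from a "driver" Hamiltonian whose ground state is easy to prepare to
# a "problem" Hamiltonian whose ground state encodes the answer. Swept
# slowly enough, the system stays in its lowest-energy state throughout.
X = np.array([[0, 1], [1, 0]], dtype=float)    # Pauli-X
Z = np.array([[1, 0], [0, -1]], dtype=float)   # Pauli-Z
H_driver = -X        # ground state: the equal superposition
H_problem = -Z       # ground state: |0>, the "answer" here

for s in np.linspace(0.0, 1.0, 5):
    H = (1 - s) * H_driver + s * H_problem     # H(s) interpolates
    energies, _ = np.linalg.eigh(H)
    print(f"s={s:.2f}  lowest energy: {energies[0]:+.3f}")
```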
The researchers' combined approach essentially takes the adiabatic method and adds the error-correction compatibility of the gate model. In their experiment, they tested it on a simulated system of nine qubits, each connected to its neighbors and individually controlled. It's depicted in the video above, which shows the qubits as yellow crosses that turn blue when they interact.
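The "digital twist" can be sketched in the same spirit: rather than sweeping the energy landscape continuously, the evolution is chopped into a sequence of discrete gate-like steps (a first-order Trotter split). The snippet below is a single-qubit toy version of that idea, assuming the same driver and problem Hamiltonians as above; it is not the nine-qubit circuits from the Nature paper:

```python
import numpy as np
from scipy.linalg import expm

# Single-qubit toy of the "digital twist" (not the paper's nine-qubit
# circuits): the continuous sweep is chopped into discrete gate-like
# unitaries, one pair per time slice (a first-order Trotter split).
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H_driver, H_problem = -X, -Z

steps, dt = 200, 0.1
state = np.array([1, 1], dtype=complex) / np.sqrt(2)  # driver ground state

for k in range(steps):
    s = (k + 1) / steps
    state = expm(-1j * dt * (1 - s) * H_driver) @ state  # driver slice
    state = expm(-1j * dt * s * H_problem) @ state       # problem slice

print(np.abs(state) ** 2)  # weight piles up on |0>, the problem's ground state
```

Because each slice is an ordinary gate, the whole sweep can in principle be run on gate-model hardware, which is what opens the door to the error-correction techniques the Google engineers describe next.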
"The crucial advantage for the future is that this digital implementation is fully compatible with known quantum error correction techniques, and can therefore be protected from the effects of noise," Rami Barends and Alireza Shabani, quantum electronics engineers with Google, wrote in a blog post.
The result is "a general-purpose algorithm which can be scaled to an arbitrarily large quantum computer," they said.