Google scientists said on Wednesday they have reached an important milestone in their quest to develop an effective quantum computer. A new study shows they’ve reduced the error rate — long an impediment to the much-vaunted technology.
Quantum computing has been touted as a revolutionary advance that uses our growing scientific understanding of the subatomic world to create a machine whose performance far exceeds that of today’s conventional computers.
However, the technology remains largely theoretical, and many thorny issues still stand in the way — including stubbornly high error rates.
In new research published in the journal Nature, the Google Quantum AI Lab described a system that can significantly reduce the error rate.
That could give the US tech giant a leg up on rivals like IBM, which are also working on superconducting quantum processors.
While traditional computers process information in bits, each of which is either a 0 or a 1, quantum computers use qubits, which can be a combination of both at the same time.
This property, known as superposition, means that a quantum computer can process an enormous number of potential outcomes at the same time.
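As a rough textbook illustration of what that means (this notation does not appear in the study itself), a single qubit's state is written as a weighted combination of the two classical values:

\[
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1
\]

Here \(\alpha\) and \(\beta\) are complex amplitudes, and their squared magnitudes give the probabilities of reading out a 0 or a 1 when the qubit is measured.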
The computers harness some of the most intriguing aspects of quantum mechanics, including a phenomenon known as "entanglement", in which two qubits can share a single quantum state even when they are far apart.
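The standard textbook example of such a shared state (again, an illustration rather than something described in the study) is the two-qubit Bell state, in which neither qubit has a definite value on its own, yet measuring one immediately fixes the outcome of the other:

\[
|\Phi^{+}\rangle = \frac{1}{\sqrt{2}}\bigl(|00\rangle + |11\rangle\bigr)
\]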
‘Magic’
But a problem called decoherence can cause qubits to lose their information when they interact with the outside world and fall out of their quantum state.
This fragility leads to high error rates that also increase with the number of qubits, frustrating scientists trying to ramp up their experiments.
However, Google’s team said it has shown for the first time in practice that a system using error-correcting code can detect and fix errors without degrading the information.
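The basic principle can be sketched with a textbook toy example, the three-qubit bit-flip code, which is far simpler than the surface code Google actually used: a single logical qubit is spread across three physical qubits,

\[
\alpha\,|0\rangle + \beta\,|1\rangle \;\longmapsto\; \alpha\,|000\rangle + \beta\,|111\rangle,
\]

so that checking only whether neighbouring qubits agree (the parities \(Z_1 Z_2\) and \(Z_2 Z_3\)) reveals which physical qubit, if any, has flipped, without ever reading out the encoded amplitudes \(\alpha\) and \(\beta\). The surface code in the Nature study extends this idea to a two-dimensional grid of qubits, with the aim that adding more physical qubits drives the logical error rate down rather than up.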
The approach was first theorized in the 1990s, but previous attempts to implement it had only produced more errors, not fewer, said Google's Hartmut Neven, a co-author of the study.
“But if all components of your system have sufficiently low error rates, then the magic of quantum error correction kicks in,” Neven said at a press conference.
Julian Kelly, another co-author of the study, hailed the development as “an important scientific milestone” and said that “quantum error correction is the most important single technology for the future of quantum computing”.
Neven said the result was still “not good enough, we need to get a really low margin of error”.
He added that “there are more steps to be taken” to make the dream of a viable quantum computer a reality.
Google claimed in 2019 it had reached a milestone known as “quantum supremacy,” when the tech giant said its Sycamore machine performed a computation in 200 seconds that would have taken a conventional supercomputer 10,000 years to complete.
However, that claim has since proved controversial, with Chinese researchers saying last year that a supercomputer could have beaten Sycamore's time.
More information:
Suppressing quantum errors by scaling a surface code logical qubit, Nature (2023). DOI: 10.1038/s41586-022-05434-1
On the way to error-resistant quantum computers, Nature (2023). DOI: 10.1038/d41586-022-04532-4
© 2023 AFP