(Paris) Google scientists have announced a new breakthrough on the road to the quantum computer of the future, with an experiment that significantly reduces the error rate, a major obstacle to this “Holy Grail” of computing, according to a study published Wednesday in Nature.
The universal quantum computer is expected to transform computing radically, with gigantic processing power beyond comparison with conventional machines.
But this revolutionary machine, regarded as a “Grail” by computer scientists, is still a long way off, because many technological obstacles remain to be overcome, in particular an error rate far too high to build reliable applications.
An experiment conducted by the Google Quantum AI research department has shown that error correction can be significantly improved, allowing the tech giant to strengthen its position in the race for the quantum computer.
Superconducting quantum processors, like those from Google or its rival IBM, exploit the astonishing properties of quantum physics that govern the world at the smallest scales.
They use quantum bits called qubits: these basic building blocks of quantum computing can occupy an infinity of possible states that superpose (0 and 1 at the same time) and become entangled, whereas the bits of classical computers have only two possible states (0 or 1).
Superposition and entanglement make it possible to perform massive mathematical operations in parallel, because the computer is able to explore all the solutions at the same time and not one by one.
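As a purely illustrative sketch, not taken from the study, the short NumPy program below builds the simplest entangled pair of qubits, a “Bell state”: each qubit is in a superposition of 0 and 1, yet a measurement only ever returns 00 or 11, never 01 or 10.

import numpy as np

# Single-qubit basis state |0>
zero = np.array([1.0, 0.0])

# Hadamard gate: puts one qubit into an equal superposition of 0 and 1
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# CNOT gate: entangles two qubits (flips the second when the first is 1)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Start from |00>, superpose the first qubit, then entangle the pair
state = np.kron(zero, zero)
state = np.kron(H, np.eye(2)) @ state
state = CNOT @ state

# The result is the Bell state (|00> + |11>) / sqrt(2):
# measuring gives 00 or 11 with equal probability, never 01 or 10
probabilities = np.abs(state) ** 2
print(dict(zip(["00", "01", "10", "11"], probabilities.round(2))))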
“Tipping point”
But the manipulation of qubits comes up against a major physical obstacle called decoherence, which makes quantum properties vanish at the slightest external disturbance. It is this fragility that generates a high error rate, all the more troublesome because the errors multiply as the number of qubits grows.
Google has tested a method using error-correcting codes capable of detecting and correcting errors without damaging the information.
This system, intended to improve the logical performance of the machine, was theorized at the end of the 1990s, and is therefore not new. Except that putting it into practice had so far produced the opposite of the hoped-for effect: the more qubits there are, the larger the correction code must grow and… the more the correction performance degrades.
Google scientists say they have, for the first time, developed an error-correcting code that reverses this trend. “It’s a tipping point, the magic of the correction has worked,” commented Dr Hartmut Neven, the lead author, at a press conference.
“But that’s not enough, we now need to achieve a much lower error rate,” conceded the scientist. There is therefore still a long way to go before the technology is useful.
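As an order-of-magnitude illustration only, based on the textbook scaling formula for surface codes rather than on the study’s data, the short calculation below shows the tipping point in question: when the error rate of the physical qubits sits below a certain threshold (assumed here to be 1%), enlarging the correction code suppresses logical errors; above it, enlarging the code makes them worse.

# Toy calculation (not Google's data): textbook scaling of a surface code,
# with an assumed ~1% threshold and a 0.1 prefactor chosen for illustration.
def logical_error_rate(p_physical, distance, p_threshold=0.01):
    # Approximate scaling: eps_L ~ 0.1 * (p / p_th) ** ((d + 1) / 2)
    return 0.1 * (p_physical / p_threshold) ** ((distance + 1) / 2)

for p in (0.02, 0.005):  # one physical error rate above, one below the threshold
    rates = {d: logical_error_rate(p, d) for d in (3, 5, 7)}
    print(f"physical error rate {p:.1%}:",
          {d: f"{rate:.1e}" for d, rate in rates.items()})

# Above the threshold (2%), the logical error rate grows as the code gets bigger;
# below it (0.5%), it shrinks -- the regime Google says it has now reached.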
In 2019, Google claimed “quantum supremacy”, saying that its Sycamore processor had completed in 3 minutes a calculation that would have taken a conventional supercomputer more than 10,000 years. The announcement was later disputed, in particular because the calculation performed served no purpose other than to claim that victory.