Tuesday, November 23, 2021

"Math may have caught up with Google’s quantum-supremacy claims"

From Ars Technica, November 23:

But, given the rapidly evolving quantum computing landscape, that may not matter.

In 2019, word filtered out that a quantum computer built by Google had performed calculations that the company claimed would be effectively impossible to replicate on supercomputing hardware. That turned out not to be entirely correct, since Google had neglected to consider the storage available to supercomputers; once that was included, the quantum computer's lead shrank to just a matter of days.

Adding just a handful of additional qubits, however, would re-establish the quantum computer's vast lead. More recently, a draft manuscript placed on the arXiv pointed out a critical fact: Google's claims relied on comparisons to one very specific approach to performing the calculation on standard computing hardware. There are other ways to perform the calculation, and the paper suggests one of those would allow a supercomputer to actually pull ahead of its quantum competitor.

More than one road to random

The calculation Google performed was specifically designed to be difficult to simulate on a normal computer. It set the 54 qubits of its Sycamore processor in a random state, then let quantum interference among neighboring qubits influence how the system evolves over time. After a short interval, the hardware started repeatedly measuring the state of the qubits. Each individual measurement produced a string of random bits, making Sycamore into a very expensive random-number generator. But if enough measurements are made, certain patterns generated by the quantum interference become apparent.
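The "patterns" in the random bits are, per Google's published work (background fact, not stated in this excerpt), detected with a statistic called linear cross-entropy fidelity: score each sampled bitstring by its ideal probability under the circuit, average, rescale. A toy sketch of the idea, using a small made-up "random state" rather than an actual 54-qubit circuit:

```python
# Toy sketch of linear cross-entropy benchmarking, the statistic Google
# used to detect the interference "pattern" in Sycamore's samples.
# F = 2**n * mean(p_ideal(sampled bitstring)) - 1
# Samples drawn from the ideal distribution score near 1; uniformly
# random bitstrings (no interference pattern) score near 0.
import numpy as np

rng = np.random.default_rng(0)
n = 10                 # toy register; Sycamore's experiment used 53 qubits
dim = 2 ** n

# Stand-in for the ideal output distribution of a random circuit:
# probabilities of a randomly chosen quantum state.
amps = rng.normal(size=dim) + 1j * rng.normal(size=dim)
p_ideal = np.abs(amps) ** 2
p_ideal /= p_ideal.sum()

def linear_xeb(samples: np.ndarray) -> float:
    """Linear cross-entropy fidelity of a batch of sampled bitstrings."""
    return dim * p_ideal[samples].mean() - 1

good = rng.choice(dim, size=200_000, p=p_ideal)   # "quantum" samples
noise = rng.integers(dim, size=200_000)           # patternless samples
print(f"ideal sampler XEB:   {linear_xeb(good):.3f}")   # near 1
print(f"uniform sampler XEB: {linear_xeb(noise):.3f}")  # near 0
```

The point of the statistic is that computing it requires knowing each sampled string's ideal probability, which is exactly the expensive classical calculation discussed next.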

Because the rules of this quantum interference are understood, it's also possible to calculate the patterns we should see in the random numbers generated by Sycamore. But doing those calculations is very computationally expensive and gets more expensive with each additional qubit in the system. Google estimated that it would take an unrealistically long time to perform these calculations on the world's most advanced supercomputers.
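To see why the expense climbs with every added qubit, and why the storage issue below matters: the brute-force way to compute those probabilities tracks all 2^n complex amplitudes of the state, so memory doubles per qubit. A back-of-the-envelope sketch (Google's actual comparisons used more sophisticated methods than this):

```python
# Memory needed for a brute-force statevector simulation:
# an n-qubit state holds 2**n complex amplitudes, so cost
# doubles with every qubit added.
BYTES_PER_AMPLITUDE = 8  # single-precision complex: two 4-byte floats

def statevector_bytes(n_qubits: int) -> int:
    """Bytes needed to store a full n-qubit statevector."""
    return (2 ** n_qubits) * BYTES_PER_AMPLITUDE

for n in (30, 42, 53, 54):
    pib = statevector_bytes(n) / 2**50  # pebibytes
    print(f"{n} qubits: {pib:,.6f} PiB")
```

At 53 qubits this already lands in the tens of petabytes, which is why counting a supercomputer's attached storage, not just its RAM, changed the comparison.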

The one flaw in this argument, pointed out early in the process, was that Google had neglected to consider the storage attached to the world's then-largest supercomputer, which would significantly cut into Sycamore's lead. But the reality remained that these computations were very difficult for classical computing hardware.

The new manuscript focuses on a key aspect of that phrase: these computations. Google chose a very specific method of computing the expected behavior of its processor, but there are other ways of doing equivalent computations. In the intervening time, a few options that perform better have been explored. Now, Feng Pan, Keyang Chen, and Pan Zhang describe a specific method that allows a GPU-based cluster to produce an equivalent output in only 15 hours. Run it on a leading supercomputer, and they estimate it would outperform the Sycamore quantum processor.....

....MUCH MORE
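The faster alternatives alluded to above are, to our understanding, tensor-network contractions: instead of evolving one enormous statevector, the circuit's gates are treated as small tensors and multiplied together in a cleverly chosen order. A toy illustration of the contraction idea on a two-qubit circuit (the actual Pan-Chen-Zhang work involves vastly larger networks and elaborate contraction orderings):

```python
# Toy tensor-network contraction of a 2-qubit circuit (H on qubit 0,
# then CNOT): contract the gate tensors directly with einsum rather
# than evolving a full statevector. Only the idea carries over to the
# 53-qubit case; the hard part there is choosing the contraction order.
import numpy as np

zero = np.array([1.0, 0.0])                     # |0> input state
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float).reshape(2, 2, 2, 2)

# psi[a, b] = sum over i, j, k of CNOT[a,b,i,j] * H[i,k] * |0>[k] * |0>[j]
psi = np.einsum('abij,ik,k,j->ab', CNOT, H, zero, zero)
print(psi)  # Bell state: amplitude 1/sqrt(2) on |00> and on |11>
```

For wide, shallow circuits a good contraction order can touch far fewer numbers than the full 2^n statevector, which is what makes this route competitive.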

Previously:

May 2020
"Inside big tech’s high-stakes race for quantum supremacy"
Not into quantum computers? Then you've made the decision to be a peasant, working for those who are.
And the rich get richer.
See: How to Think About Companies: "Advantage Flywheels".

November 2021

"Two of World’s Biggest Quantum Computers Made in China"

October 2020
The Market for Quantum Technology: Early Revenue-Generating Applications

September 2019
"Ghost post! Google creates world’s most powerful computer, NASA ‘accidentally reveals’ ...and then publication vanishes"

March 2018

"Google Chases Quantum Supremacy with 72 Qubit Processor"

October 2019

"A new era of computing could bring about a 'quantum apocalypse'"
"'Our modern systems of finance, commerce, communication, transportation, manufacturing, energy, government, and healthcare will for all intents and purposes cease to function,' cyber security expert warns"
Huh.

And dozens and dozens more