In a paper posted today on arXiv (1910.09534), a team of IBM scientists explains that Google's 53-qubit quantum computing supremacy experiment should take only 2.5 days to simulate on Summit, Oak Ridge National Lab's supercomputer. Well, that's quite a discrepancy from Google's claim that it would take over 10,000 years to perform the computation.
But what does this say about Google's supremacy claim itself?
First, note that IBM is not disputing the quantum computation Google's team has performed. They are disputing the time required to perform an equivalent classical simulation on the world's most powerful supercomputer. Google said 10,000 years; IBM, who happened to build that supercomputer, says 2.5 days if you optimize the computation using secondary storage.
This technique amounts to breaking the classical simulation up into pieces and off-loading the non-active parts to disk, while keeping the active ones in RAM.
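To make that concrete, here is a minimal single-machine sketch of the idea: a plain numpy state-vector simulator whose amplitude vector is memory-mapped to disk, with only one block "active" in RAM at a time. The sizes, file name, and gate-application scheme are illustrative choices of mine, not anything taken from the IBM paper:

```python
import numpy as np

# Toy sizes so the demo actually runs; the real experiment has 2^53 amplitudes.
N_QUBITS = 20                       # 2^20 complex128 amplitudes = 16 MiB on disk
BLOCK = 2 ** 16                     # amplitudes held in RAM at any one time

# The full state vector lives on disk; a fresh memmap file is zero-filled.
state = np.memmap("state.bin", dtype=np.complex128, mode="w+",
                  shape=(2 ** N_QUBITS,))
state[0] = 1.0                      # start in |00...0>

def apply_gate(gate, q):
    """Apply a 1-qubit gate on qubit q, streaming one block at a time.
    Assumes 2**(q+1) <= BLOCK, so each amplitude pair sits in one block."""
    stride = 2 ** q
    for start in range(0, 2 ** N_QUBITS, BLOCK):
        block = np.array(state[start:start + BLOCK])  # load the active piece
        pairs = block.reshape(-1, 2, stride)          # halves differ in bit q
        a = pairs[:, 0, :].copy()
        b = pairs[:, 1, :].copy()
        pairs[:, 0, :] = gate[0, 0] * a + gate[0, 1] * b
        pairs[:, 1, :] = gate[1, 0] * a + gate[1, 1] * b
        state[start:start + BLOCK] = block            # off-load back to disk

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)          # Hadamard gate
apply_gate(H, 0)                                      # qubit 0 into superposition
```

On Summit the same trade plays out at a vastly larger scale: the full 53-qubit amplitude vector measures in the petabytes, too big for the machine's aggregate RAM but not for its file system.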
What's important here is that IBM does not claim to have found a better (i.e. less complex) way to simulate Google's experiment, only a faster one. They are still doing roughly the same amount of exponential work; their technique lowers the constant factors, which in turn improves the compute time.
However, increase the size of the quantum computation to be simulated by just a few qubits and you are back to a very long computation, because the dependence on the number of qubits is still exponential. The paper actually shows results for a 54-qubit computation, where the simulation time rises to about 8 days. Hence, adding a single qubit more than triples the time needed to simulate the 53-qubit experiment: still exponential.
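To spell out that arithmetic under the simplest cost model, where the simulation time scales as $T(n) \approx c \cdot 2^n$ for $n$ qubits (a simplifying assumption of mine; the true cost also depends on circuit depth):

$$
\frac{T(54)}{T(53)} = \frac{c \cdot 2^{54}}{c \cdot 2^{53}} = 2,
\qquad \text{while the reported figures give} \qquad
\frac{8\ \text{days}}{2.5\ \text{days}} = 3.2 .
$$

The observed factor is even steeper than the model's factor of 2 per qubit, so just a handful of additional qubits pushes the simulation back into months or years.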
While the supremacy debate may live on in the media, the claim is about the same as saying that the cost of electricity produced from renewable sources has reached grid parity: it marks a crossing point where the economics change and start pushing in favor of one energy source over another. And as with supremacy, the cost of electricity is highly dependent on the underlying hypotheses (e.g. amortization rules); hence, it's more a benchmark than something you would rely on for any serious investment decision.
Speaking of investment: Summit was backed by a $325 million contract and takes 2.5 days to run a simulation that Google's machine performs in about three and a half minutes. Does that mean ORNL would have paid $325 billion for Google's machine? (Yes, 2.5 days is still about 1,000 times longer than 3.5 minutes.)
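For the record, that 1,000× figure is just this back-of-envelope division:

```python
# Ratio between Summit's simulated runtime and Google's reported runtime.
simulation_seconds = 2.5 * 24 * 3600   # 2.5 days
quantum_seconds = 3.5 * 60             # about 3.5 minutes
print(simulation_seconds / quantum_seconds)   # ~1029, i.e. roughly 1000x
```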