Quantum computers are on track to surpass the performance of top-tier supercomputers, scientists say.
Ready, Set, Quantum! The Great Race to the Future of Computing
The latest quantum computing benchmark has lit a fuse under the competition, pinpointing the strengths and weaknesses of various quantum processing units (QPUs).
This high-stakes showdown, led by a team at the Jülich Research Centre in Germany, pitted 19 QPUs from five suppliers - IBM, Quantinuum, IonQ, Rigetti and IQM - against each other. The goal? To determine which chips stood tallest in the arena of high-performance computing (HPC).
These quantum systems were put through their paces at different "widths" (the total number of qubits) and "depths" of two-qubit gates. Two-qubit gates are operations that act on pairs of qubits, typically entangling them; depth measures the length of a circuit - a proxy for its complexity and execution time.
IBM's QPUs displayed a commanding advantage when it came to depth. Meanwhile, Quantinuum stole the show in the width category, outperforming its rivals where larger numbers of qubits were involved. IBM's QPUs also showed marked improvement in performance across successive iterations, most prominently between the earlier Eagle and more contemporary Heron chip generations.
These insights, unveiled in a study posted to the preprint database arXiv on Feb. 10, hint at areas where both hardware and software improvements can pay off. In particular, advances in firmware and the integration of fractional gates - custom gates available on Heron - appeared to play a significant role in cutting circuit complexity.
However, the latest version of the Heron chip, dubbed IBM Marrakesh, fell short of expectations despite having half the error per layered gate (EPLG) of IBM's previous QPU, IBM Fez.
Quantum Leap Ahead
Smaller companies have made substantial gains, too. Notably, one Quantinuum chip passed the benchmark at a width of 56 qubits - a milestone, because at that scale the problem lies beyond what existing classical computers can simulate exactly.
"In the case of Quantinuum H2-1, the experiments of 50 and 56 qubits are already beyond the capabilities of exact simulation in HPC systems, and the results remain relevant," the researchers wrote in their preprint study.
Specifically, the Quantinuum H2-1 chip generated results at 56 qubits, completing three layers of the Linear Ramp Quantum Approximate Optimization Algorithm (LR-QAOA) - a benchmarking algorithm - involving 4,620 two-qubit gates.
"To the best of our knowledge, this is the largest implementation of QAOA to solve an FC combinatorial optimization problem on real quantum hardware that offers better results over random guessing," the scientists said in the study.
IBM's Fez took on the greatest depth of the systems tested. In a test involving a 100-qubit problem with up to 10,000 layers of LR-QAOA (nearly 1 million two-qubit gates), Fez retained some coherent information until nearly the 300-layer mark. The lowest-performing QPU in the tests was Rigetti's Ankaa-2.
The team devised the benchmark to evaluate a QPU's potential for practical applications. Their aim was to create a test with a clear, consistent set of rules that is easy to run, works across platforms and delivers actionable performance metrics.
Their benchmark is based on the MaxCut problem, which presents a graph of vertices (nodes) and edges (connections) and asks the system to split the nodes into two sets so that the number of edges running between the two sets is maximized.
This is handy as a benchmark because it is computationally very tough, and the difficulty can be ramped up by increasing the size of the graph, the scientists said in the paper.
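To make the problem concrete, here is a minimal sketch (not from the study) that brute-forces MaxCut on a tiny example graph, counting, for every possible split of the nodes into two sets, how many edges cross between them:

```python
from itertools import product

# A small example graph: 4 nodes, edges listed as pairs of node indices.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
n_nodes = 4

def cut_size(assignment, edges):
    """Number of edges whose endpoints land in different sets (0 or 1)."""
    return sum(1 for u, v in edges if assignment[u] != assignment[v])

# Brute force: try every way of assigning each node to set 0 or set 1.
best = max(product([0, 1], repeat=n_nodes), key=lambda a: cut_size(a, edges))
print(best, cut_size(best, edges))
```

The number of possible splits doubles with every added node, which is why exhaustive search quickly becomes infeasible and the problem gets harder as the graph grows.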
A system was deemed to have failed the test when its results reached a mixed state - indistinguishable from those of a random sampler.
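As an illustration of that criterion (a sketch, not the study's exact procedure): for MaxCut, a uniformly random assignment cuts each edge with probability 1/2, so its expected cut value is half the edge count. A QPU whose sampled bitstrings do no better than that baseline carries no usable signal:

```python
import random

def cut_size(assignment, edges):
    return sum(1 for u, v in edges if assignment[u] != assignment[v])

def random_baseline(n_nodes, edges, samples=10_000):
    """Average cut value of uniformly random assignments (about half the edges)."""
    total = 0
    for _ in range(samples):
        assignment = [random.randint(0, 1) for _ in range(n_nodes)]
        total += cut_size(assignment, edges)
    return total / samples

# A QPU run "fails" once its average sampled cut value is indistinguishable
# from random_baseline(...), i.e. the output has decayed to a mixed state.
```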
Because the benchmark relies on a testing protocol that is straightforward and scalable, and can produce meaningful results from relatively few samples, it is inexpensive to run, the computer scientists added.
This new benchmark isn't without flaws. Performance is dependent on fixed schedule parameters, meaning that parameters are pre-set and not adjusted dynamically during computation. The researchers suggested that in addition to their own test, "different candidate benchmarks to capture essential aspects of performance should be proposed, and the best of them with the most explicit set of rules and utility will emerge."
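To illustrate what "fixed schedule parameters" means here: in a linear-ramp schedule, the QAOA angles are set in advance as a simple linear function of the layer index rather than being optimized during the run. Below is a sketch of one such parameterization; the ramp constants are illustrative assumptions, not values taken from the study.

```python
def linear_ramp_schedule(p, delta_gamma=0.6, delta_beta=0.6):
    """Pre-set QAOA angles for p layers: gamma ramps up, beta ramps down.

    delta_gamma / delta_beta are illustrative ramp constants, not values
    taken from the study.
    """
    gammas = [delta_gamma * (k + 1) / p for k in range(p)]
    betas = [delta_beta * (1 - (k + 1) / p) for k in range(p)]
    return gammas, betas

gammas, betas = linear_ramp_schedule(p=3)
print(gammas)  # angles grow linearly with layer index
print(betas)   # angles shrink linearly with layer index
```

Because the angles are fixed up front, the same schedule can be run on any QPU without per-device tuning, which is part of what keeps the benchmark cheap - at the cost of not adapting to each chip's quirks.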
Taken together, the results suggest that software-side advances, such as firmware improvements and the integration of fractional gates, can significantly improve QPU performance in high-performance computing. Yet error rates, captured by metrics such as errors per layered gate (EPLG), still pose challenges for companies like IBM, even as newer chip generations such as Marrakesh arrive.