Supercomputing Bridging The Gap Against Quantum Advantage


Hewlett Packard Enterprise announced new research, published in Science Advances, that demonstrates how supercomputers can be used to test and benchmark computational performance for the quantum computing community, challenging theoretical claims about the performance that future quantum computers will deliver.

The results reveal how a problem called Gaussian Boson Sampling (GBS), long considered the domain of quantum computing, was simulated using high-performance computing (HPC), or supercomputing, expanding the boundary of problems that supercomputers can address.
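
At the heart of GBS is the hafnian, a matrix function that counts the perfect matchings of a weighted graph; computing it is #P-hard, which is why sampling from a GBS device is believed to be classically intractable. As a rough illustration only, and not the algorithm from the paper, here is a minimal Python sketch of the hafnian via its textbook recursion:

```python
def hafnian(A):
    """Hafnian of a symmetric 2n x 2n matrix (list of lists), computed by
    direct recursion over perfect matchings. Cost grows as (2n-1)!!, so this
    is purely illustrative; production GBS simulators use far faster methods."""
    n = len(A)
    if n == 0:
        return 1.0
    if n % 2 == 1:
        return 0.0          # odd-sized matrices have no perfect matching
    total = 0.0
    rest = list(range(1, n))
    for k in rest:
        sub = [i for i in rest if i != k]            # pair index 0 with k
        minor = [[A[i][j] for j in sub] for i in sub]
        total += A[0][k] * hafnian(minor)
    return total

# Example: for a 2x2 symmetric matrix, the hafnian is the off-diagonal entry.
print(hafnian([[0.0, 1.5], [1.5, 0.0]]))  # 1.5
```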

In the new research, team members from HPE’s HPC and AI Business Group and Hewlett Packard Labs, HPE’s R&D arm, collaborated with the University of Bristol and Imperial College London to improve on a previous prediction that simulating a Gaussian Boson Sampling problem of the same size as an experimental quantum computer would take 600 million years on the world’s largest supercomputer.

After developing an algorithm and applying it to a simulation of the GBS problem on smaller, older generations of HPE-built supercomputers, the teams used the simulation results to predict that the same problem would take just 73 days on today’s fastest supercomputer. The novel algorithm represents a billion-fold speed-up over previous approaches for classical computers.
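
As a back-of-the-envelope check on those figures (an assumption on our part, not a calculation from the paper), a billion-fold speed-up applied to the original 600-million-year estimate lands in the same ballpark as the quoted 73 days:

```python
# Rough sanity check of the quoted figures; illustrative assumptions only.
old_estimate_years = 600e6   # previous 600-million-year prediction
speedup = 1e9                # reported billion-fold algorithmic speed-up

same_machine_days = old_estimate_years / speedup * 365.25
print(f"~{same_machine_days:.0f} days on the same machine")   # ~219 days

# The quoted 73-day figure then implies a target system roughly
# 219 / 73 = 3x faster than the machine behind the old estimate.
print(f"implied hardware factor: {same_machine_days / 73:.1f}x")
```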

“Today’s research, a result of a strong collaboration between teams at HPE, University of Bristol and Imperial College London, was inspired by the leading edge of quantum computing development to extend the value that supercomputing delivers, when combined with optimised algorithms, to accurately compare computational advantage between classical computers and quantum computers, and set new standards of performance,” said Justin Hotard, senior vice president and general manager, HPC and AI at HPE.

“We look forward to furthering this effort by partnering with the quantum computing community and integrating the HPE Cray supercomputer product line with other enabling technologies to advance the journey to developing future quantum computers.”

The latest experiment showcases the increasing value of supercomputing and how it can be used to test and support current and near-term quantum experiments, helping to accelerate the commercial relevance of quantum computers. The research also predicts that as supercomputing continues to advance, such as with upcoming exascale supercomputers up to ten times faster than today’s most powerful systems, quantum computing results can be verified in even shorter windows of time, shrinking from months to weeks.

HPE’s latest research outcome is a powerful example of the sustained value of HPC and the potential for novel algorithms in both classical and quantum computing. The HPE teams were inspired by claims made in a previous paper, “Quantum Computational Advantage Using Photons,” from the University of Science and Technology of China (USTC). In that paper, USTC’s researchers shared findings from an experiment in which a large, complex quantum state of light was measured using single-photon detectors following the Gaussian Boson Sampling (GBS) protocol. USTC predicted that the sampling task, which its single-purpose photonic quantum computer completed in 200 seconds, would take 600 million years to simulate on the world’s largest supercomputer.

HPE researchers applied an algorithm that calculates exact, correlated photon-detection probabilities for GBS simulations. They first ran the simulations on GW4’s Isambard supercomputer and on an HPE supercomputer used internally as a test system, then used those results to predict an estimated runtime of 73 days on today’s fastest supercomputer and roughly three weeks on an exascale supercomputer.
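
The scaling step can be pictured as a simple performance extrapolation. The sketch below uses hypothetical measured numbers, chosen only so the output reproduces the quoted 73-day and three-week projections (the paper’s actual methodology and figures may differ), and assumes runtime scales inversely with sustained machine performance:

```python
# Hypothetical extrapolation from a small test system to larger machines.
# All concrete numbers below are placeholders, not values from the paper.
measured_petaflops = 5.0   # hypothetical sustained performance of the test system
measured_days = 6400.0     # hypothetical projected runtime on that system

def projected_days(target_petaflops: float) -> float:
    """Assume runtime scales inversely with sustained PFLOP/s."""
    return measured_days * measured_petaflops / target_petaflops

print(f"{projected_days(442):.0f} days")        # ~72 days on a ~442 PFLOP/s system
print(f"{projected_days(1500) / 7:.0f} weeks")  # ~3 weeks on an exascale-class system
```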