The theme of this year’s April Meeting of the American Physical Society is the “Feynman Century”, because the iconoclastic, Nobel-prize-winning physicist was born in 1918. This morning, at a special session devoted to Feynman, quantum-computing expert Christopher Monroe of the University of Maryland spoke about Feynman’s early contributions to quantum computing, made before his untimely death in 1988.
That theme continued in an afternoon session at the conference where nuclear and particle physicists discussed how quantum computers could be applied to their work. A huge challenge to those studying the physics of quarks (quantum chromodynamics or QCD) is that it takes vast amounts of computing power just to calculate the properties of relatively simple systems.
Low barrier to entry
Quantum computers, which (at least in principle) can solve certain problems much more efficiently than conventional computers, could offer a way forward. Earlier this year we reported what is probably the first-ever nuclear-physics calculation done using quantum computers: the binding energy of the deuteron. Thomas Papenbrock of Oak Ridge National Laboratory in Tennessee explained how commercial cloud quantum-computing services from IBM and Rigetti had made this calculation possible, pointing out that these services have made the barrier to entry to quantum computing very low.
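Calculations of this kind typically use a variational approach: a parameterized quantum circuit prepares a trial state, the energy is measured, and a classical outer loop tunes the parameters until the energy is minimized. As a rough illustration only, here is a purely classical sketch of that idea for a one-qubit toy Hamiltonian; the coefficients are hypothetical and are not those of the actual deuteron calculation.

```python
import math

# Toy one-qubit Hamiltonian H = c0*I + cz*Z + cx*X
# (hypothetical coefficients for illustration -- NOT the real deuteron Hamiltonian)
c0, cz, cx = 0.5, 0.3, -0.8

def energy(theta):
    """Expectation value <psi(theta)|H|psi(theta)> for the one-parameter
    ansatz |psi(theta)> = cos(theta)|0> + sin(theta)|1>."""
    # For this ansatz: <I> = 1, <Z> = cos(2*theta), <X> = sin(2*theta)
    return c0 + cz * math.cos(2 * theta) + cx * math.sin(2 * theta)

# Classical outer loop: scan the parameter and keep the lowest energy.
# On real hardware the energy would be estimated from repeated measurements.
e_min = min(energy(k * math.pi / 1000) for k in range(1000))

# For a one-qubit Hamiltonian the exact ground-state energy is known in closed form,
# so we can check that the variational scan finds it.
exact = c0 - math.hypot(cz, cx)
print(round(e_min, 4), round(exact, 4))
```

On cloud services such as those from IBM and Rigetti, the `energy` evaluation would run on the remote quantum processor while the minimization loop runs on the user’s laptop, which is part of why the barrier to entry is so low.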
He was followed by Martin Savage of the University of Washington, an expert in lattice QCD, a technique that requires mind-boggling amounts of computing power. He pointed out that the QCD community already relies on large computing infrastructures that are created and maintained by both physicists and computing experts. A similar technological and human infrastructure, he believes, must be created for quantum computing of lattice QCD.
Solving the “sign problem”
Quantum computers could play a crucial role in overcoming the “sign problem”, which makes lattice-QCD calculations increasingly difficult as the number of particles increases. They could also be used to calculate the dynamical evolution of a system, charting particle interactions in a collider, for example.
The IBM and Rigetti quantum computers used to perform the first nuclear calculation had 16 and 19 qubits respectively, so my jaw dropped when Savage said that about 4 million qubits would be needed to do a lattice-QCD calculation better than state-of-the-art conventional computers can. And my jaw dropped even further when he said that experts in the industry didn’t seem to think this would be a problem!