Review on Quantum Computing for Lattice Field Theory

Feb 1, 2023
25 pages
Published in:
  • PoS LATTICE2022 (2023) 228
Contribution to:
  • LATTICE 2022 (39th International Symposium on Lattice Field Theory)
e-Print:
Report number:
  • MIT-CTP/5482

Abstract (SISSA):
In these proceedings, we review recent advances in applying quantum computing to lattice field theory. Quantum computing offers the prospect of simulating lattice field theories in parameter regimes that are largely inaccessible to the conventional Monte Carlo approach, such as the sign-problem-afflicted regimes of finite baryon density, topological terms, and out-of-equilibrium dynamics. The first proof-of-concept quantum computations of lattice gauge theories in (1+1) dimensions have been accomplished, and the first resource-efficient quantum algorithms for lattice gauge theories in (1+1) and (2+1) dimensions have been developed. The path towards quantum computations of (3+1)-dimensional lattice gauge theories, including lattice QCD, requires many incremental steps of improving both quantum hardware and quantum algorithms. After reviewing these requirements and recent advances, we discuss the main challenges and future directions.
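To give a concrete sense of the (1+1)-dimensional models behind the proof-of-concept computations mentioned above, the sketch below builds the widely used spin formulation of the lattice Schwinger model (QED in 1+1 dimensions) with open boundaries, in which the gauge field is eliminated via Gauss's law. This is a standard textbook construction, not code from the reviewed work; the coupling names `x` and `mu` and the helper functions are illustrative choices.

```python
import numpy as np

# Pauli matrices and single-site identity
I2 = np.eye(2)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
sp = (sx + 1j * sy) / 2  # sigma^+ (raising)
sm = (sx - 1j * sy) / 2  # sigma^- (lowering)

def op_at(op, n, N):
    """Embed a single-site operator at site n of an N-site chain via Kronecker products."""
    mats = [I2] * N
    mats[n] = op
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

def schwinger_spin_hamiltonian(N, x=1.0, mu=0.5):
    """Spin Hamiltonian of the lattice Schwinger model (one common convention):
        H = x * sum_n (sp_n sm_{n+1} + h.c.)        # gauge-invariant hopping
          + (mu/2) * sum_n (-1)^n sz_n              # staggered fermion mass
          + sum_n L_n^2                             # electric energy,
        with L_n = sum_{k<=n} (sz_k + (-1)^k) / 2   # link field fixed by Gauss's law
    """
    dim = 2 ** N
    H = np.zeros((dim, dim), dtype=complex)
    # Hopping term
    for n in range(N - 1):
        H += x * (op_at(sp, n, N) @ op_at(sm, n + 1, N))
        H += x * (op_at(sm, n, N) @ op_at(sp, n + 1, N))
    # Staggered mass term
    for n in range(N):
        H += (mu / 2) * ((-1) ** n) * op_at(sz, n, N)
    # Electric-field energy of the eliminated gauge links
    for n in range(N - 1):
        L = np.zeros((dim, dim), dtype=complex)
        for k in range(n + 1):
            L += (op_at(sz, k, N) + ((-1) ** k) * np.eye(dim)) / 2
        H += L @ L
    return H

H = schwinger_spin_hamiltonian(N=4)
assert np.allclose(H, H.conj().T)  # the Hamiltonian is Hermitian
ground_energy = np.linalg.eigvalsh(H)[0]
```

Exact diagonalization of such small systems is how quantum-hardware results for these models are typically cross-checked; the exponential growth of the 2^N-dimensional matrix with system size is precisely the classical bottleneck that motivates quantum simulation.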
Note:
  • 25 pages, 9 figures; Proceedings of the 39th International Symposium on Lattice Field Theory, 8th-13th August 2022, Rheinische Friedrich-Wilhelms-Universität Bonn, Germany
  • quantum computer
  • baryon density
  • lattice field theory
  • quantum algorithm
  • lattice
  • numerical calculations
  • hardware
  • Monte Carlo
  • topological