A quantum algorithm to train neural networks using low-depth circuits

Dec 14, 2017
Abstract: (submitter)
Can near-term gate-model quantum processors offer a quantum advantage for practical applications in the pre-fault-tolerance noise regime? A class of algorithms that has shown some promise in this regard is the so-called classical-quantum hybrid variational algorithms. Here we develop a low-depth quantum algorithm to train generative neural networks using variational quantum circuits. We introduce a method that employs the quantum approximate optimization algorithm as a subroutine in order to produce and then sample low-energy distributions of Ising Hamiltonians. We sample these states to train neural networks and demonstrate training convergence for numerically simulated noisy circuits with depolarizing error rates of up to 4%.
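
To make the sampling subroutine concrete, here is a minimal sketch (not the authors' code) of the core idea in the abstract: a depth-one (p = 1) QAOA circuit prepares a state biased toward low-energy configurations of a small Ising Hamiltonian, and bitstrings are sampled from it. It uses a NumPy state-vector simulation; the couplings J, fields h, angles (gamma, beta), and qubit count below are illustrative placeholders, not values from the paper, and in the paper's scheme the angles would be optimized variationally.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4                                            # number of spins / qubits (assumed)
J = {(0, 1): 1.0, (1, 2): -1.0, (2, 3): 1.0}     # illustrative Ising couplings
h = np.zeros(n)                                  # no local fields in this toy example

# Ising energy of every computational-basis state, with spins z_i in {+1, -1}.
bits = (np.arange(2 ** n)[:, None] >> np.arange(n)) & 1
spins = 1 - 2 * bits                             # bit 0 -> +1, bit 1 -> -1
energies = spins @ h
for (i, j), Jij in J.items():
    energies = energies + Jij * spins[:, i] * spins[:, j]

def qaoa_state(gamma, beta):
    """p = 1 QAOA state: exp(-i beta B) exp(-i gamma C) |+>^n."""
    psi = np.full(2 ** n, 2 ** (-n / 2), dtype=complex)  # uniform superposition
    psi *= np.exp(-1j * gamma * energies)                # cost-phase layer (diagonal)
    # Mixer layer: product over qubits k of exp(-i beta X_k).
    for k in range(n):
        psi = psi.reshape(2 ** (n - k - 1), 2, 2 ** k)
        a, b = psi[:, 0, :].copy(), psi[:, 1, :].copy()
        psi[:, 0, :] = np.cos(beta) * a - 1j * np.sin(beta) * b
        psi[:, 1, :] = np.cos(beta) * b - 1j * np.sin(beta) * a
        psi = psi.reshape(-1)
    return psi

# Illustrative angles; optimizing them pushes the samples toward lower energies.
psi = qaoa_state(gamma=0.6, beta=0.4)
probs = np.abs(psi) ** 2
probs /= probs.sum()
samples = rng.choice(2 ** n, size=1000, p=probs)
print("mean sampled energy:", energies[samples].mean(),
      " ground-state energy:", energies.min())
```

In the algorithm described above, samples drawn this way stand in for samples from a low-temperature distribution of the Ising model and are then used in the training loop of the generative neural network.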
  • quantum circuit: variational
  • neural network
  • quantum algorithm
  • noise
  • Hamiltonian
  • hybrid
  • quantum advantage
  • quantum approximate optimization algorithm
  • gate