Fast batch gradient descent in quantum neural networks
  • JooYong Shim, Korea University
  • Joongheon Kim, Korea University (Corresponding Author: joongheon@korea.ac.kr)

Abstract

We propose a novel batch gradient descent algorithm for parameterized quantum circuits that significantly reduces the training time complexity of quantum neural networks with respect to batch size. The batch data are arranged in a quantum random access memory (qRAM) structure and mapped onto a single circuit that estimates the average loss. Because the number of circuits decreases, quantum amplitude estimation can be applied over a wider range, yielding a quadratic speedup in batch size.
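The core idea in the abstract — encoding a whole batch into one state whose measurement statistics give the average loss — can be illustrated with a small classical simulation. This is a hedged sketch, not the authors' construction: it assumes the standard mean-estimation encoding used with quantum amplitude estimation, where each per-sample loss ℓᵢ ∈ [0, 1] is rotated onto an ancilla qubit so that the probability of measuring the ancilla in |1⟩ equals the batch-average loss.

```python
import numpy as np

rng = np.random.default_rng(0)
B = 8                                 # batch size (power of two for a clean index register)
losses = rng.uniform(0, 1, size=B)    # per-sample losses, assumed rescaled into [0, 1]

# qRAM-style encoding (assumed construction):
#   |psi> = (1/sqrt(B)) * sum_i |i> ( sqrt(1 - l_i)|0> + sqrt(l_i)|1> )
# The probability of measuring the ancilla in |1> equals the batch-average loss,
# which amplitude estimation can then recover to accuracy eps with O(1/eps)
# oracle calls, versus the O(1/eps^2) samples of classical Monte-Carlo averaging.
amps = np.zeros(2 * B)
amps[0::2] = np.sqrt((1 - losses) / B)   # ancilla |0> branch for each index i
amps[1::2] = np.sqrt(losses / B)         # ancilla |1> branch for each index i

p_one = np.sum(amps[1::2] ** 2)          # the quantity amplitude estimation targets
assert np.isclose(p_one, losses.mean())
print(f"P(ancilla = 1) = {p_one:.6f}, batch-average loss = {losses.mean():.6f}")
```

Since the whole batch lives in one state, only a single circuit per gradient step needs amplitude estimation, which is where the claimed quadratic scaling in batch size comes from.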
27 Nov 2024  Submitted to Electronics Letters
02 Dec 2024  Submission Checks Completed
02 Dec 2024  Assigned to Editor
02 Dec 2024  Review(s) Completed, Editorial Evaluation Pending
03 Dec 2024  Reviewer(s) Assigned
19 Dec 2024  Editorial Decision: Revise Major
21 Dec 2024  1st Revision Received