2022
DOI: 10.22331/q-2022-07-07-759

The Quantum Approximate Optimization Algorithm and the Sherrington-Kirkpatrick Model at Infinite Size

Abstract: The Quantum Approximate Optimization Algorithm (QAOA) is a general-purpose algorithm for combinatorial optimization problems whose performance can only improve with the number of layers p. While QAOA holds promise as an algorithm that can be run on near-term quantum computers, its computational power has not been fully explored. In this work, we study the QAOA applied to the Sherrington-Kirkpatrick (SK) model, which can be understood as energy minimization of n spins with all-to-all random signed couplings. Th…
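For concreteness, the SK cost function described in the abstract (n spins with all-to-all random signed couplings) can be sketched as follows. The ±1 coupling choice, the 1/√n normalization, and the brute-force search are illustrative assumptions for this sketch, not details taken from the paper.

```python
import numpy as np
from itertools import product

def sk_energy(J, z):
    """SK energy E(z) = (1/sqrt(n)) * sum_{i<j} J_ij z_i z_j
    for spins z_i in {-1, +1}; z @ J @ z double-counts each pair,
    hence the extra factor of 1/2."""
    n = len(z)
    return (z @ J @ z) / (2 * np.sqrt(n))

rng = np.random.default_rng(0)
n = 8
# Symmetric all-to-all random signed couplings, zero diagonal
J = np.triu(rng.choice([-1.0, 1.0], size=(n, n)), k=1)
J = J + J.T

# Exhaustive minimization is feasible only for small n (2^n states)
best = min(sk_energy(J, np.array(z)) for z in product([-1, 1], repeat=n))
```

QAOA at depth p prepares a quantum state whose measured bit strings are used to estimate, and iteratively lower, exactly this kind of cost.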

Cited by 130 publications
(170 citation statements)
References 17 publications
“…Although VQASVM has the freedom to choose up to O(M) parameters using a hardware-efficient ansatz, we can choose a PQC design with O(polylog(M)) parameters at the cost of degraded accuracy or training-loss performance. 37 Taking this as an assumption with a given accuracy bound of ε, the training time complexity of VQASVM drops to O(MN polylog(M)/ε), suggesting a potential asymptotic speed-up compared to SVM. Our numerical experiments with various PQC designs 35 show an interesting observation that the number of parameters can be smaller than the number of training data, as shown in Figs.…”
Section: Discussion
confidence: 99%
“…We propose a Variational Quantum Approximate Support Vector Machine (VQASVM) algorithm, inspired by the quantum approximate optimization algorithm, that can reduce the number of parameters of PQC designs for certain problems. 37 Fig. 2a summarizes the process of VQASVM.…”
Section: Variational Quantum Approximate Support Vector Machine
confidence: 99%
“…There has been some study of a near-term quantum algorithm (the QAOA [FGG14]) optimizing spin glasses, with recently-proven rigorous performance bounds [CvD21,FGGZ22]. In fact, for large enough clause density, the performance of the QAOA is identical on a random instance of Max-kXOR and on its corresponding spin glass [BM21, BFM + 21, BGMZ22].…”
Section: Related Work
confidence: 99%
“…As the development of quantum computers evolves significantly [1–27], a fundamental need to characterize the attributes of problem solving on quantum computers has arisen. Gate-model quantum computers have particular relevance, since most of these architectures allow practical solutions to be implemented in near-term settings.…”
Section: Introduction
confidence: 99%
“…In a gate-model quantum computer, the computational steps are realized via unitary gates. Each gate is associated with a gate-parameter value, while the computational problem fed into the quantum computer defines an objective function [6–10, 29] (objective-function examples can be found in [7, 9–11, 14, 15]). The aim of problem solving is to maximize the objective function value over several iteration steps.…”
Section: Introduction
confidence: 99%
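The loop this excerpt describes, in which gate parameters are tuned over iterations to maximize an objective, can be illustrated with a toy one-parameter example. The RY rotation, the ⟨X⟩ objective, and the parameter-shift gradient below are generic textbook choices for the sketch, not details from the cited works.

```python
import numpy as np

def expectation(theta):
    # |psi> = RY(theta)|0> = [cos(theta/2), sin(theta/2)]
    # Objective: <psi| X |psi> = 2 cos(theta/2) sin(theta/2) = sin(theta)
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return 2 * c * s

# Parameter-shift rule: df/dtheta = [f(t + pi/2) - f(t - pi/2)] / 2,
# which evaluates the circuit itself rather than a finite difference.
theta, lr = 0.1, 0.5
for _ in range(100):
    grad = (expectation(theta + np.pi / 2) - expectation(theta - np.pi / 2)) / 2
    theta += lr * grad  # gradient ascent on the objective

# theta converges to pi/2, where <X> attains its maximum of 1
```

QAOA follows the same pattern with 2p parameters (one mixing and one phase angle per layer) and an objective estimated from repeated measurements.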