2021
DOI: 10.22331/q-2021-08-30-531

Towards understanding the power of quantum kernels in the NISQ era

Abstract: A key problem in quantum computing is understanding whether quantum machine learning (QML) models implemented on noisy intermediate-scale quantum (NISQ) machines can achieve quantum advantages. Recently, Huang et al. [Nat Commun 12, 2631] partially answered this question through the lens of quantum kernel learning: they showed that quantum kernels can learn specific datasets with lower generalization error than the optimal classical kernel methods. However, most of their results are establis…
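The quantum kernel construction the abstract refers to can be sketched numerically: data is encoded into a quantum state by a feature map, and the kernel is the squared overlap (fidelity) between two encoded states. The single-qubit RY feature map below is a hypothetical toy choice for illustration, not the encoding studied in the paper.

```python
import numpy as np

def feature_map(x):
    # Toy quantum feature map: apply RY(x) to |0>, giving the
    # real-amplitude state [cos(x/2), sin(x/2)] on one qubit.
    return np.array([np.cos(x / 2), np.sin(x / 2)])

def quantum_kernel(x1, x2):
    # Fidelity-style kernel: K(x1, x2) = |<phi(x1)|phi(x2)>|^2,
    # which on a quantum device would be estimated from measurements.
    return abs(np.dot(feature_map(x1), feature_map(x2))) ** 2

# Gram matrix for a few sample points; symmetric, PSD, ones on the diagonal.
X = np.array([0.0, 0.5, 1.0])
K = np.array([[quantum_kernel(a, b) for b in X] for a in X])
```

Such a Gram matrix can be handed directly to a classical kernel method (e.g. an SVM with a precomputed kernel); the quantum-versus-classical question studied by Huang et al. is whether kernels of this form can separate datasets that no efficient classical kernel can.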


Cited by 41 publications (42 citation statements)
References 32 publications
“…Many of the quantum advantages associated with near term variational QML algorithms relate to model capacity, expressivity, and sample efficiency. In particular, variational QML algorithms may yield reductions in the number of required trainable parameters [238], generalization error [226,227,228,229], the number of examples required to learn a model [236,222], and improvements in training landscapes [277,226,227,231,232]. Evidence supporting one or more of these advantages has been found in both theoretical models and proof of principle implementations of quantum neural networks (QNNs) [226,227,228,231] and quantum kernel methods (QKM) [222,229,225].…”
Section: Quantum Machine Learning (mentioning; confidence: 99%)
“…In particular, variational QML algorithms may yield reductions in the number of required trainable parameters [238], generalization error [226,227,228,229], the number of examples required to learn a model [236,222], and improvements in training landscapes [277,226,227,231,232]. Evidence supporting one or more of these advantages has been found in both theoretical models and proof of principle implementations of quantum neural networks (QNNs) [226,227,228,231] and quantum kernel methods (QKM) [222,229,225]. It is notable that these methods are both closely related to VQAs leveraging gradient-based classical optimizers (indeed, they often share overlapping definitions in the literature, as briefly noted in 2019 [213]) [223,411,412].…”
Section: Quantum Machine Learning (mentioning; confidence: 99%)
“…Concretely, for the synthetic dataset, Refs. [53][54][55] exhibited the advantages of quantum neural networks [56,57] and quantum kernels [56] in the measure of generalization error [58][59][60][61][62][63]. However, quantum supervised and unsupervised learning models may encounter trainability issues, where the gradients exponentially vanish for the number of qubits [64,65].…”
Section: Introduction (mentioning; confidence: 99%)
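The trainability issue raised in the last excerpt (gradients vanishing exponentially in the number of qubits, i.e. barren plateaus) rests on a concentration phenomenon that can be illustrated with plain linear algebra. The sketch below uses Haar-random pure states as a stand-in for the outputs of deep random circuits; that substitution is an assumption made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def z_expectation_variance(n_qubits, n_samples=2000):
    """Variance of <Z> on the first qubit over Haar-random pure states."""
    dim = 2 ** n_qubits
    # Z on the first qubit: +1 on basis states whose leading bit is 0, else -1.
    z = np.where(np.arange(dim) < dim // 2, 1.0, -1.0)
    # Normalized complex-Gaussian vectors are Haar-distributed pure states.
    psi = rng.standard_normal((n_samples, dim)) \
        + 1j * rng.standard_normal((n_samples, dim))
    psi /= np.linalg.norm(psi, axis=1, keepdims=True)
    expectations = np.abs(psi) ** 2 @ z  # <psi|Z|psi> for each sample
    return expectations.var()

# The exact variance is 1 / (2**n + 1): exponential concentration in qubits,
# which is the mechanism behind exponentially vanishing gradients.
variances = {n: z_expectation_variance(n) for n in (2, 4, 6, 8)}
```

The same concentration argument applies to the gradient components of a variational cost function, which is why the excerpt's cited works [64, 65] report exponentially vanishing gradients for sufficiently expressive circuits.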