2021
DOI: 10.48550/arxiv.2106.04975
Preprint

The dilemma of quantum neural networks

Abstract: The core of quantum machine learning is to devise quantum models with better trainability and a lower generalization error bound than their classical counterparts, ensuring better reliability and interpretability. Recent studies confirmed that quantum neural networks (QNNs) have the ability to achieve this goal on specific datasets. In this regard, it is of great importance to understand whether these advantages are preserved on real-world tasks. Through systematic numerical experiments, we empirically obser…

Cited by 8 publications (9 citation statements) · References 66 publications (80 reference statements)
“…Huang et al. (2021) proved that for quantum processes, QML provides no advantage in minimizing the average prediction error, only in minimizing the worst-case prediction error. Kübler et al. (2021) and Qian et al. (2021) showed there is little indication that QML can improve supervised learning.…”
Section: Findings and Discussion
confidence: 99%
“…While there is substantial evidence supporting reductions in generalization error [226,227,228,229], evidence of poor generalization performance under certain constructions also exists (e.g. see this recent paper [417]). This may be partly attributable to shallower quantum circuits providing better utility bounds than deeper circuits [418], which contrasts with classical neural network intuition, where increased layer depth is associated with an exponential increase in model expressiveness [419].…”
Section: Quantum Machine Learning
confidence: 98%
“…However, quantum supervised and unsupervised learning models may encounter trainability issues, where gradients vanish exponentially in the number of qubits [64,65]. Moreover, a recent study has shown that the performance of quantum supervised learning models on real-world datasets can, in fact, be worse than that of classical learning models [66].…”
Section: Introduction
confidence: 99%