2023
DOI: 10.1038/s41467-023-36159-y
Quantum machine learning beyond kernel methods

Abstract: Machine learning algorithms based on parametrized quantum circuits are prime candidates for near-term applications on noisy quantum computers. In this direction, various types of quantum machine learning models have been introduced and studied extensively. Yet, our understanding of how these models compare, both mutually and to classical models, remains limited. In this work, we identify a constructive framework that captures all standard models based on parametrized quantum circuits: that of linear quantum mo…
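The framework of linear quantum models alluded to in the abstract can be illustrated with a minimal sketch: a parametrized-quantum-circuit model written as an expectation value f_theta(x) = <phi(x)|O_theta|phi(x)>, i.e. a model that is linear in the data-encoding feature state. The single-qubit encoding and the parametrized observable below are hypothetical choices for illustration, not the paper's construction.

```python
import numpy as np

# Pauli matrices used to build a parametrized observable
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def feature_state(x):
    """Toy single-qubit angle encoding: |phi(x)> = RY(x)|0>."""
    return np.array([np.cos(x / 2), np.sin(x / 2)], dtype=complex)

def model(x, theta):
    """Linear quantum model: f_theta(x) = <phi(x)| O(theta) |phi(x)>.

    The model is linear in the feature state rho(x) = |phi(x)><phi(x)|;
    all the trainable structure sits in the observable O(theta).
    """
    phi = feature_state(x)
    O = np.cos(theta) * Z + np.sin(theta) * X  # parametrized observable
    return float(np.real(phi.conj() @ O @ phi))
```

For this toy encoding the expectation works out analytically to cos(theta - x), which makes the linearity in the feature state easy to verify by hand.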


Cited by 69 publications (38 citation statements)
References 37 publications
“…The QGAIL with amplitude encoding has a logarithmic scaling, but there is no known efficient algorithm to encode arbitrary classical data into quantum memory in superposition [51]. The parameter complexity advantage in VQC-based supervised learning is also empirically observed in [52][53][54][55]. These quantum algorithms demonstrate the parameter complexity advantage of VQC in QML for classical tasks.…”
Section: QGAIL for Quantum Control
confidence: 94%
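The logarithmic scaling mentioned in the statement above comes from amplitude encoding packing a length-d classical vector into the amplitudes of only ceil(log2(d)) qubits. A minimal sketch of the classical bookkeeping (the helper name is hypothetical; the hard part the quote refers to is preparing this state efficiently on hardware):

```python
import numpy as np

def amplitude_encode(v):
    """Pad and normalize a classical vector into the amplitude vector of
    ceil(log2(len(v))) qubits -- the source of the logarithmic scaling."""
    n_qubits = max(1, int(np.ceil(np.log2(len(v)))))
    state = np.zeros(2 ** n_qubits, dtype=complex)
    state[: len(v)] = v
    norm = np.linalg.norm(state)
    if norm == 0:
        raise ValueError("cannot encode the zero vector")
    return state / norm, n_qubits

# 8 classical features fit into 3 qubits; 1024 features would need only 10.
state, n = amplitude_encode(np.arange(1.0, 9.0))
```

Note this only computes the target amplitudes; as the quoted statement says, no general-purpose efficient circuit is known that loads arbitrary classical data into such a superposition.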
“…Like other quantum machine learning models (Lloyd et al, 2020; Jerbi et al, 2021), QNLP models encode words into different qubits and design a quantum circuit routine (a.k.a. an ansatz) to derive the sentence representation before feeding it to a measurement layer. In its mathematical form, the model encodes each word as a unit complex vector, and it has an all-linear structure up to the measurement outcome.…”
Section: Complex-valued Neural Network
confidence: 99%
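The "unit complex vector per word, all-linear up to measurement" structure described above can be sketched with plain linear algebra. The encoding and composition below are hypothetical simplifications (one qubit per word, tensor-product composition), not the specific QNLP ansatz of the cited work:

```python
import numpy as np

def word_state(theta, phi):
    """Hypothetical one-qubit word embedding: a unit complex vector
    parametrized by Bloch-sphere angles."""
    return np.array([np.cos(theta / 2),
                     np.exp(1j * phi) * np.sin(theta / 2)])

def sentence_state(words):
    """All-linear composition: tensor the word states together."""
    state = np.array([1.0 + 0j])
    for w in words:
        state = np.kron(state, w)
    return state

def measure_prob(state, basis_index=0):
    """The only nonlinearity: the Born-rule measurement |amplitude|^2."""
    return float(np.abs(state[basis_index]) ** 2)
```

Everything up to `measure_prob` is linear in each word vector; the modulus-squared of the measurement is where the nonlinearity enters, matching the quote's "all-linear structure up to the measurement outcome".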
“…However, due to this quantum feature mapping, the interpretation of the vector quantization algorithm with respect to the original data space may be limited, whereas, within the Bloch sphere (Hilbert space), the prototype principle and interpretation paradigms remain true. Thereby, the mapping here is analogous to the kernel feature mapping in support vector machines [38] as pointed out frequently [85, 86, 87].…”
Section: Quantum Approaches for Vector Quantization
confidence: 99%
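The SVM-kernel analogy in the statement above is usually made concrete through the fidelity kernel k(x, y) = |<phi(x)|phi(y)>|^2: the quantum feature map plays the role of the classical kernel feature map, and inner products in Hilbert space replace kernel evaluations. A toy single-qubit sketch (the angle-encoding feature map here is an illustrative assumption):

```python
import numpy as np

def feature_map(x):
    """Toy single-qubit quantum feature map: |phi(x)> = RY(x)|0>."""
    return np.array([np.cos(x / 2), np.sin(x / 2)], dtype=complex)

def quantum_kernel(x, y):
    """Fidelity kernel k(x, y) = |<phi(x)|phi(y)>|^2 -- the quantum
    analogue of a classical kernel evaluation in an SVM."""
    return float(np.abs(feature_map(x).conj() @ feature_map(y)) ** 2)
```

For this encoding the kernel reduces to cos^2((x - y)/2), so identical inputs give k = 1 and antipodal Bloch-sphere states give k = 0, mirroring how a classical kernel measures similarity in feature space.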