ICASSP 2022 - IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp43922.2022.9747369
Quantum Long Short-Term Memory

Cited by 55 publications (39 citation statements).
References 27 publications.
“…3 one can see that the quantum circuit minimizes loss significantly faster in the first few epochs than the classical network. The loss curve is also smoother in the quantum case, as was also noticed in [2]. For bigger networks (about 30 parameters), we observe that the loss at the end of training is lower than for the smaller network, demonstrating that the network's predictions improve with increasing numbers of qmodes and parameters.…”
Section: Results (supporting)
confidence: 76%
“…Recent papers show benefits of using this approach for satellite image classification [6] or modeling joint probability distributions [8]. QML has also been used to analyse time series [1,7,2]. In this paper we propose a new approach to the task of time-series prediction using Quantum Recurrent Neural Networks in the continuous-variable paradigm.…”
Section: Introduction (mentioning)
confidence: 99%
“…VQC Circuit Ansatz, G(θ g ). Mainstream QNNs [13,16,17] adopt a VQC ansatz constructed from parameterized single-qubit rotation gates followed by nearest-neighbor coupling of qubits using fixed two-qubit CNOT gates, as illustrated in Figure 1. Such a circuit ansatz has demonstrated superior expressive capability in various applications.…”
Section: Quantum GANs Background (mentioning)
confidence: 99%
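The ansatz this citation describes — one layer of parameterized single-qubit rotations followed by nearest-neighbor CNOT entanglement — can be sketched directly with NumPy state vectors. This is a minimal illustrative sketch, not code from any of the cited works; the 2-qubit size and the use of RY rotations are assumptions for brevity.

```python
import numpy as np

def ry(theta):
    """Single-qubit Y-rotation gate RY(theta)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

# CNOT with qubit 0 (left kron factor) as control, qubit 1 as target.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def ansatz_layer(state, thetas):
    """One hardware-efficient layer: RY on each qubit, then CNOT(0 -> 1)."""
    rotations = np.kron(ry(thetas[0]), ry(thetas[1]))
    return CNOT @ (rotations @ state)

state = np.zeros(4); state[0] = 1.0          # start in |00>
out = ansatz_layer(state, [np.pi / 2, 0.0])  # rotate qubit 0, then entangle
# Result is the Bell state (|00> + |11>) / sqrt(2)
print(np.round(out, 3))
```

Stacking several such layers (with fresh parameters per layer) yields the deeper circuits used in practice; the entangling CNOTs are what lets the rotations on different qubits interact.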
“…It is clear that the same fidelity, and hence the same loss, can be achieved with different parameter sets due to the periodicity of the sine and cosine functions, resulting in distorted generated images (see Figure 2). Although input data can be mapped to the range [0, π] before applying the angle encoding [8,16], such unfaithful normalization prevents a QNN from learning an accurate model and producing high-quality output. We therefore conclude that the de facto angle encoding (even with normalization) fails to ensure high-quality output in a fidelity-based GAN framework.…”
Section: Preliminary Analysis (mentioning)
confidence: 99%
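The periodicity problem this citation raises is easy to demonstrate numerically: under scalar RY angle encoding, two inputs that differ by a full period produce identical states, so a fidelity-based loss cannot distinguish them. The following is an illustrative sketch under that assumption (single-qubit RY encoding of one real feature), not the cited paper's code.

```python
import numpy as np

def angle_encode(x):
    """RY(x)|0> = [cos(x/2), sin(x/2)] — scalar angle encoding on one qubit."""
    return np.array([np.cos(x / 2), np.sin(x / 2)])

def fidelity(a, b):
    """|<a|b>|^2 for real state vectors."""
    return abs(np.dot(a, b)) ** 2

x = 0.7
aliased = x + 2 * np.pi  # differs from x, but fidelity is 2*pi-periodic
print(fidelity(angle_encode(x), angle_encode(aliased)))  # -> 1.0: indistinguishable

def normalize(x, lo, hi):
    """Map a feature from [lo, hi] into [0, pi] to avoid wrap-around —
    the 'unfaithful normalization' the text criticizes."""
    return (x - lo) / (hi - lo) * np.pi
```

With `normalize`, distinct in-range inputs always map to distinct angles in [0, π], removing the aliasing at the cost of squeezing the data into half the rotation range.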
“…It is a model capable of running inference fully on quantum computers. In contrast, Quantum Long Short-Term Memory (Chen, Yoo, and Fang 2022) was also proposed; however, in the QLSTM, quantum computing is used only to enhance the input data, transforming it via a Variational Quantum Eigensolver.…”
Section: Quantum Neural Network for Sequential Input (mentioning)
confidence: 99%
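The hybrid structure this citation attributes to the QLSTM — a classical LSTM cell whose gate inputs are first passed through a variational quantum block — can be sketched schematically. This is not the paper's implementation: here `vqc` is a classical stand-in returning per-wire Pauli-Z expectation values of an angle-encoded product state (cos of each angle), and the gate names, sizes, and weight layout are all illustrative assumptions.

```python
import numpy as np

def vqc(v):
    """Stand-in for a variational quantum circuit: angle-encode v and
    return <Z> per wire, which for a product state is cos(v_i)."""
    return np.cos(v)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def qlstm_cell(x, h, c, W):
    """One hybrid LSTM step: each gate reads the VQC-transformed [x; h]."""
    v = vqc(np.concatenate([x, h]))  # quantum feature map of the cell input
    f = sigmoid(W["f"] @ v)          # forget gate
    i = sigmoid(W["i"] @ v)          # input gate
    g = np.tanh(W["g"] @ v)          # candidate cell state
    o = sigmoid(W["o"] @ v)          # output gate
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

rng = np.random.default_rng(0)
n, m = 2, 3                          # illustrative input / hidden sizes
W = {k: rng.normal(size=(m, n + m)) * 0.1 for k in "figo"}
h, c = np.zeros(m), np.zeros(m)
h, c = qlstm_cell(np.array([0.5, -0.2]), h, c, W)
print(h.shape, c.shape)
```

The classical recurrence (forget/input/output gating) is untouched; only the feature map feeding the gates is quantum, which matches the citation's point that inference is not performed fully on quantum hardware.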