2023
DOI: 10.1038/s41534-022-00672-7
Theoretical error performance analysis for variational quantum circuit based functional regression

Abstract: Noisy intermediate-scale quantum (NISQ) devices enable the implementation of the variational quantum circuit (VQC) for quantum neural networks (QNN). Although the VQC-based QNN has succeeded in many machine learning tasks, the representation and generalization powers of the VQC still require further investigation, particularly where the dimensionality of classical inputs is concerned. In this work, we first put forth an end-to-end QNN, TTN-VQC, which consists of a quantum tensor network based on a tensor-train network…


Cited by 29 publications (13 citation statements). References 41 publications.
“…Mean square error (MSE) is an error metric given by the mean of squared differences between predictions and the true label values, and root mean squared error (RMSE) is the square root of MSE (Das, 2004). Mean absolute error (MAE) is another evaluation metric, defined as the mean of absolute differences between predictions and the true label values (Qi, 2022). For all of these error metrics, unlike R-squared, the optimal fit occurs when they are close to zero.…”
Section: Graph Convolutional Networks (GCNs)
confidence: 99%
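The three metrics named in the quoted passage are easy to state concretely. A minimal NumPy illustration (the data values here are invented for the example):

```python
import numpy as np

y_true = np.array([3.0, -0.5, 2.0, 7.0])  # true label values
y_pred = np.array([2.5,  0.0, 2.0, 8.0])  # model predictions

mse = np.mean((y_true - y_pred) ** 2)     # mean of squared differences
rmse = np.sqrt(mse)                       # square root of MSE
mae = np.mean(np.abs(y_true - y_pred))    # mean of absolute differences

print(mse, rmse, mae)  # 0.375, ~0.612, 0.5 — all three are 0 for a perfect fit
```

Unlike R-squared, which approaches 1 for a good fit, each of these is bounded below by zero and smaller is better.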
“…Some proposals even consider employing TNs for optimizing parameters or hyperparameters of QML algorithms [96]. For specific implementations, first evidence exists that the locality of TNs can overcome barren plateaus [97,98]. In particular, local loss functions, which can be implemented using local Hamiltonians, provide a favourable loss landscape in which gradients do not vanish exponentially fast [92,99,100].…”
Section: Quantum Tensor Network Machine Learning
confidence: 99%
“…Trainable TN encoding using a latent-space representation from the outgoing bond dimensions is usually optimized together with the parameters of the QML circuit [40,114]. A theoretical study on the error performance of functional regression models finds upper bounds when certain continuity requirements on the loss and the network are met [98]. In particular, they find that the optimization error connected to barren plateaus is negligible if the loss over the TN parameters is Lipschitz and satisfies a Polyak-Łojasiewicz condition.…”
Section: Tensor Network for Data Encoding
confidence: 99%
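The Polyak-Łojasiewicz (PL) condition invoked in the snippet above has a standard statement (given here in its usual textbook form, not quoted from the paper): a differentiable loss ℒ satisfies the PL condition with constant μ > 0 if

```latex
\frac{1}{2}\,\bigl\|\nabla \mathcal{L}(\boldsymbol{\theta})\bigr\|_2^2
\;\ge\; \mu\,\bigl(\mathcal{L}(\boldsymbol{\theta}) - \mathcal{L}^{*}\bigr)
\qquad \text{for all } \boldsymbol{\theta},
```

where ℒ* is the minimum of the loss. Intuitively, the gradient norm cannot be small unless the loss is already near its optimum, which rules out the flat regions characteristic of barren plateaus and (together with Lipschitz gradients) yields linear convergence of gradient descent.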
“…To address the above issues, in our previous work [37,38] we put forth a TTN-VQC architecture that composes a tensor-train network (TTN) [39] with a VQC. The TTN introduces non-linearity into the quantum input features to enhance the representation power of the VQC, and many works [37,40,41] have demonstrated that the TTN introduces an inductive bias that helps the VQC overcome the barren plateau problem during training. Since the TTN is a classical simulation of quantum circuits that could be implemented using one- and two-qubit circuits [42], we claim that the TTN is associated with a quantum tensor network (QTN) and classically simulates the wave function of the corresponding quantum circuits.…”
Section: Introduction
confidence: 99%
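The tensor-train format at the heart of the TTN front end can be sketched numerically. Below is a hedged NumPy illustration of applying a TT-format weight matrix to a small input vector; the shapes, ranks, and variable names are invented for the example and are not taken from the paper (an 8-dimensional input factored as 2×2×2, TT-rank 3):

```python
import numpy as np

rng = np.random.default_rng(0)
r = 3  # illustrative TT-rank; boundary ranks are fixed to 1
G1 = rng.standard_normal((1, 2, 2, r))  # core shape: (r0, in1, out1, r1)
G2 = rng.standard_normal((r, 2, 2, r))  # core shape: (r1, in2, out2, r2)
G3 = rng.standard_normal((r, 2, 2, 1))  # core shape: (r2, in3, out3, r3)

def tt_matvec(G1, G2, G3, x):
    """Apply the TT-format 8x8 weight matrix encoded by (G1, G2, G3) to x."""
    t = x.reshape(2, 2, 2)                    # input indices (i1, i2, i3)
    t = np.einsum('ijb,imn->jbmn', G1[0], t)  # contract i1 -> (j1, r1, i2, i3)
    t = np.einsum('jbim,bikc->jkcm', t, G2)   # contract r1, i2 -> (j1, j2, r2, i3)
    t = np.einsum('jkcm,cmnd->jknd', t, G3)   # contract r2, i3 -> (j1, j2, j3, 1)
    return t.reshape(-1)                      # flatten to the 8-dim output

x = rng.standard_normal(8)
y = tt_matvec(G1, G2, G3, x)  # y.shape == (8,)
```

The point of the format is that the full 8×8 matrix is never materialized: storage and contraction cost scale with the TT-rank rather than with the product of the mode dimensions, which is what makes TT layers viable as a trainable dimensionality-reduction front end for a VQC.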