2021
DOI: 10.48550/arxiv.2111.04695
Preprint
ORQVIZ: Visualizing High-Dimensional Landscapes in Variational Quantum Algorithms

Abstract: Variational Quantum Algorithms (VQAs) are promising candidates for finding practical applications of near to mid-term quantum computers. There has been an increasing effort to study the intricacies of VQAs, such as the presence or absence of barren plateaus and the design of good quantum circuit ansätze. Many of these studies can be linked to the loss landscape that is optimized as part of the algorithm, and there is high demand for quality software tools for flexibly studying these loss landscapes. In our wor…

Cited by 6 publications (10 citation statements)
References 49 publications
“…After dimension reduction, we choose the first two principal components, which explain most of the variance, as the landscape-spanning vectors. Refer to [71] and Appendix C for details. For HAA and HEA, the objective function is governed by both the 0th component (98.75% of variance for HAA, 98.55% for HEA) and the 1st component (1.24% of variance for HAA, 1.45% for HEA).…”
Section: Details
Mentioning confidence: 99%
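The PCA step described in the excerpt above — extracting two landscape-spanning directions and their explained-variance ratios from an optimization trajectory — can be sketched in a few lines of NumPy. The trajectory here is synthetic and all names are illustrative assumptions, not the API of any particular package:

```python
import numpy as np

# Hypothetical optimization trajectory: 200 steps of a 20-parameter ansatz.
rng = np.random.default_rng(0)
trajectory = np.cumsum(rng.normal(size=(200, 20)), axis=0)

# PCA via SVD of the mean-centered trajectory.
centered = trajectory - trajectory.mean(axis=0)
_, singular_values, components = np.linalg.svd(centered, full_matrices=False)

# Explained-variance ratio of each principal component.
explained = singular_values**2 / np.sum(singular_values**2)

# The first two components span the 2D plane used for the landscape scan.
direction_x, direction_y = components[0], components[1]
print(f"PC0 explains {explained[0]:.2%}, PC1 explains {explained[1]:.2%}")
```

The percentages quoted in the citation statement (e.g. 98.75% for the 0th component) are exactly this `explained` quantity for the cited paper's trajectories.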
“…Simultaneously, the projection vector e_i of each component indicates the contribution of each parameter to that component, implying how many parameters determine the value of the objective function. Refer to [71] for details.…”
Section: PCA Used in Visualization of Loss Landscape
Mentioning confidence: 99%
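Reading per-parameter contributions off the projection vectors, as the excerpt describes, amounts to inspecting the PCA loadings. A minimal sketch under a synthetic setup in which parameter 0 is constructed to dominate the trajectory's variance:

```python
import numpy as np

# Hypothetical trajectory of a 6-parameter circuit; parameter 0 is given
# a strong linear drift so that it dominates the leading component.
rng = np.random.default_rng(1)
trajectory = rng.normal(size=(100, 6))
trajectory[:, 0] += np.linspace(0, 10, 100)

centered = trajectory - trajectory.mean(axis=0)
_, _, components = np.linalg.svd(centered, full_matrices=False)

# Loadings of the first principal component: |e_0[j]| measures how much
# parameter j contributes to the dominant landscape direction.
loadings = np.abs(components[0])
dominant_parameter = int(np.argmax(loadings))
print(dominant_parameter)  # parameter 0 dominates by construction
```

A concentrated loading vector (a few large entries) indicates that a handful of parameters govern the objective; a flat one indicates that many parameters matter comparably.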
“…Note that the filter-wise normalization technique of [24] is not implemented here, since the RNN ansatz used is not scale-invariant, owing to the presence of ELU activation functions. A principal component analysis approach, similar to the one used to visualize variational quantum circuit landscapes, could also be implemented [26]. However, we found the previously described visualization method sufficient to interpret our results.…”
Section: Loss Landscape Visualization
Mentioning confidence: 99%
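The generic 2D landscape scan underlying these visualization methods — evaluating the loss on a plane through a reference point, spanned by two directions — can be sketched as follows. The loss function, directions, and grid resolution are illustrative assumptions, not the cited papers' actual setup:

```python
import numpy as np

# Stand-in for a VQA cost function over 8 circuit parameters.
def loss(params: np.ndarray) -> float:
    return float(np.sum(np.sin(params) ** 2))

# Scan the loss on a 2D plane through a reference point, spanned by two
# random unit-normalized directions (PCA directions could be used instead).
rng = np.random.default_rng(2)
origin = rng.uniform(0, 2 * np.pi, size=8)
dir_x = rng.normal(size=8); dir_x /= np.linalg.norm(dir_x)
dir_y = rng.normal(size=8); dir_y /= np.linalg.norm(dir_y)

span = np.linspace(-np.pi, np.pi, 41)
grid = np.array([[loss(origin + a * dir_x + b * dir_y) for a in span]
                 for b in span])
print(grid.shape)  # (41, 41) grid, ready for a contour plot
```

Filter-wise normalization, when applicable, would rescale `dir_x` and `dir_y` per parameter group before scanning; the excerpt notes it is only meaningful for scale-invariant ansätze.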
“…In particular, it was used to understand why skip connections in residual neural networks generalize better than vanilla convolutional neural networks. Nowadays, in the quantum computing community, it is increasingly employed to benchmark different quantum circuit architectures over a variety of tasks, such as quantum optimization or quantum machine learning [25,26]. This paper uses it to study trainability in the neural annealing paradigm.…”
Section: Introduction
Mentioning confidence: 99%