2021
DOI: 10.1103/physreva.104.012401

U(1)-symmetric recurrent neural networks for quantum state reconstruction

Cited by 25 publications (6 citation statements)
References 33 publications
“…more possible reference bases in the first step. Our active learning scheme can furthermore be generalized to state representations other than the restricted Boltzmann machines considered here, such as variational autoencoders [21], recurrent [22] and convolutional neural networks [23], generative adversarial networks [24], and transformer architectures [25].…”
Section: Discussion
confidence: 99%
“…Here, we use the implementation of RBM quantum state tomography by Beach et al. [20] in the form of a Python package called QuCumber. However, the idea of our AL scheme is independent of the specific implementation of the RBMs and can in principle be applied in combination with any other quantum state tomography scheme, such as other neural network architectures [21–25] as well as matrix-product state based state reconstruction [26, 27]. Here, the target many-body quantum state is represented in terms of an artificial neural network (RBM) wave function…”
Section: Restricted Boltzmann Machines
confidence: 99%
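The excerpt above refers to representing a quantum state as an RBM wave function. As a minimal illustrative sketch (not the QuCumber implementation, and with function and parameter names chosen here for illustration), the unnormalized real-valued RBM amplitude for a spin configuration σ is ψ(σ) = exp(a·σ) ∏_j 2 cosh(b_j + (Wσ)_j), obtained by tracing out the hidden units analytically:

```python
import numpy as np

def rbm_amplitude(sigma, a, b, W):
    """Unnormalized RBM wave-function amplitude psi(sigma).

    sigma : visible spin configuration, array of 0/1 (length n_visible)
    a     : visible biases (length n_visible)
    b     : hidden biases (length n_hidden)
    W     : weight matrix, shape (n_hidden, n_visible)

    The hidden units are summed out analytically, giving a product
    of 2*cosh terms over the hidden layer.
    """
    return np.exp(a @ sigma) * np.prod(2.0 * np.cosh(b + W @ sigma))

# Example: with all parameters zero, each of the two hidden units
# contributes a factor 2*cosh(0) = 2, so the amplitude is 4.
sigma = np.array([1, 0, 1])
amp = rbm_amplitude(sigma, np.zeros(3), np.zeros(2), np.zeros((2, 3)))
# amp == 4.0
```

Normalizing such amplitudes over all 2^n configurations yields a valid wave function; in practice the parameters are trained on measurement data, which is what the tomography schemes discussed in these citations do.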
“…Despite the encouraging results obtained from our quantum-based models, we foresee significant space for potential improvements regarding all the generative models used in this study and some not explored here. In particular, one can embed constraints into generative models, such as in U(1)-symmetric tensor networks [39] and U(1)-symmetric RNNs [41, 42]. Furthermore, including other state-of-the-art generative models with different variations is vital for establishing a more comprehensive comparison.…”
Section: Conclusion and Outlooks
confidence: 99%
“…Recently, NNs have been used to perform QST [11, 80–82]. In particular, a wide class of states can be modeled by a so-called Restricted Boltzmann Machine (RBM) (Box B and Fig.…”
Section: [N]
confidence: 99%