2018
DOI: 10.1103/PhysRevX.8.031012
Unsupervised Generative Modeling Using Matrix Product States

Abstract: Generative modeling, which learns the joint probability distribution from data and generates samples according to it, is an important task in machine learning and artificial intelligence. Inspired by the probabilistic interpretation of quantum physics, we propose a generative model using matrix product states, a tensor network originally proposed for describing (particularly one-dimensional) entangled quantum states. Our model enjoys efficient learning analogous to the density matrix renormalization group method…
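The abstract describes a model whose probabilities are squared amplitudes of a matrix product state, p(x) = |ψ(x)|²/Z (a "Born machine"). Below is a minimal sketch of that idea in Python/NumPy; the tensor layout, random initialization, and function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def random_mps(n_sites, phys_dim=2, bond_dim=4, seed=0):
    """Build a random open-boundary MPS: a list of (Dl, d, Dr) tensors (illustrative)."""
    rng = np.random.default_rng(seed)
    dims = [1] + [bond_dim] * (n_sites - 1) + [1]
    return [rng.normal(size=(dims[i], phys_dim, dims[i + 1]))
            for i in range(n_sites)]

def amplitude(mps, x):
    """psi(x): multiply the matrices selected by the configuration x."""
    vec = np.ones(1)
    for tensor, xi in zip(mps, x):
        vec = vec @ tensor[:, xi, :]          # (Dl,) @ (Dl, Dr) -> (Dr,)
    return vec[0]

def normalization(mps):
    """Z = sum_x |psi(x)|^2, contracted site by site via transfer matrices."""
    env = np.ones((1, 1))
    for tensor in mps:
        # contract ket and bra copies of the site tensor with the environment
        env = np.einsum('ab,aic,bid->cd', env, tensor, tensor)
    return env[0, 0]

mps = random_mps(n_sites=6)
x = [0, 1, 1, 0, 1, 0]
print(amplitude(mps, x) ** 2 / normalization(mps))   # p(x)
```

The learning step of the paper (DMRG-like sweeps that adjust bond dimensions) is not shown here; this only illustrates how a probability is read out of the tensor network.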

Cited by 248 publications (355 citation statements)
References 49 publications
“…1 and 2 of the main text). The estimations for random measurements were obtained using a sampling algorithm of the occupation probabilities P_U(s) for Matrix-Product-States [35].…”
Section: Appendix E: Adiabatic State Preparation
confidence: 99%
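The cited sampling algorithm draws configurations directly from the distribution encoded by an MPS. The sketch below illustrates such direct (ancestral) sampling, drawing each site from its exact conditional given the previously sampled sites; the tensor layout and names are illustrative assumptions, not the referenced implementation.

```python
import numpy as np

def random_mps(n, d=2, D=4, seed=0):
    """A random open-boundary MPS as a list of (Dl, d, Dr) tensors (illustrative)."""
    rng = np.random.default_rng(seed)
    dims = [1] + [D] * (n - 1) + [1]
    return [rng.normal(size=(dims[i], d, dims[i + 1])) for i in range(n)]

def right_environments(mps):
    """R[k] sums |psi|^2 over sites k..N-1, for every cut position k."""
    n = len(mps)
    R = [None] * (n + 1)
    R[n] = np.ones((1, 1))
    for k in range(n - 1, -1, -1):
        R[k] = np.einsum('aic,bid,cd->ab', mps[k], mps[k], R[k + 1])
    return R

def sample(mps, rng):
    """Draw one configuration x with probability |psi(x)|^2 / Z, no rejection."""
    R = right_environments(mps)
    left = np.ones(1)
    x = []
    for k, tensor in enumerate(mps):
        d = tensor.shape[1]
        weights = np.empty(d)
        for v in range(d):                    # unnormalized conditional weights
            vec = left @ tensor[:, v, :]
            weights[v] = vec @ R[k + 1] @ vec
        v = rng.choice(d, p=weights / weights.sum())
        x.append(int(v))
        left = left @ tensor[:, v, :]
    return x

rng = np.random.default_rng(1)
print(sample(random_mps(8), rng))
```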
“…As one of the most powerful numerical tools for studying quantum many-body systems [6][7][8][9], tensor networks (TNs) have drawn more attention. For instance, TNs have been recently applied to solve machine learning problems such as dimensionality reduction [10,11] and handwriting recognition [12,13]. Just as a TN allows the numerical treatment of difficult physical systems by providing layers of abstraction, deep learning achieved similar striking advances in automated feature extraction and pattern recognition using a hierarchical representation [14].…”
Section: Introduction
confidence: 99%
“…A TTN with a hierarchical structure [29] suits the two-dimensional (2D) nature of images better than those based on a one-dimensional (1D) TN, e.g. the matrix product state (MPS) [12,13,30]. Secondly, we explicitly connect machine learning to quantum quantities, such as fidelity and entanglement.…”
Section: Introduction
confidence: 99%
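One of the quantum quantities mentioned, entanglement, can be read off from the state an MPS encodes. The sketch below computes the bipartite entanglement entropy by contracting a small MPS to a dense state vector and taking an SVD at the cut; this brute-force route is only feasible for a few sites, and all names and sizes are illustrative assumptions rather than the cited works' method.

```python
import numpy as np

def mps_to_vector(mps):
    """Contract an open-boundary MPS (tensors of shape (Dl, d, Dr)) to psi."""
    psi = mps[0]
    for tensor in mps[1:]:
        psi = np.einsum('...a,aib->...ib', psi, tensor)
    return psi.reshape(-1)                    # boundary dimensions are 1

def entanglement_entropy(mps, cut, d=2):
    """Von Neumann entropy between the first `cut` sites and the rest."""
    psi = mps_to_vector(mps)
    psi = psi / np.linalg.norm(psi)
    s = np.linalg.svd(psi.reshape(d ** cut, -1), compute_uv=False)
    p = s[s > 1e-12] ** 2                     # Schmidt spectrum
    return float(-(p * np.log(p)).sum())

rng = np.random.default_rng(0)
dims = [1, 3, 3, 3, 3, 3, 1]
mps = [rng.normal(size=(dims[i], 2, dims[i + 1])) for i in range(6)]
print(entanglement_entropy(mps, cut=3))
```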
“…MPS can efficiently describe the ground states and (purified) thermal states of one-dimensional (1D) gapped systems [8][9][10][11][12]. It has also been widely and successfully applied to other areas including statistical physics [13], non-equilibrium quantum physics [9,14-17], field theories [18][19][20][21][22][23], machine learning [24][25][26][27][28], and so on. In particular, MPS is an important model in quantum information and computation (see, e.g., [29][30][31][32]). It can represent a large class of states, including GHZ [33] and AKLT states [34,35], which can implement non-trivial quantum computational tasks [36,37].…”
confidence: 99%
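The GHZ state cited above has an exact MPS representation with bond dimension 2. A minimal sketch of that standard construction follows, with a dense-vector check; the tensor layout and helper names are illustrative conventions, not taken from the cited references.

```python
import numpy as np

def ghz_mps(n):
    """GHZ state (|0...0> + |1...1>)/sqrt(2) as a bond-dimension-2 MPS."""
    bulk = np.zeros((2, 2, 2))
    bulk[0, 0, 0] = bulk[1, 1, 1] = 1.0       # virtual index copies the physical one
    left = np.zeros((1, 2, 2))
    left[0, 0, 0] = left[0, 1, 1] = 1.0 / np.sqrt(2)   # carries the normalization
    right = np.zeros((2, 2, 1))
    right[0, 0, 0] = right[1, 1, 0] = 1.0
    return [left] + [bulk.copy() for _ in range(n - 2)] + [right]

def mps_to_vector(mps):
    """Contract the MPS to a dense 2^n state vector (check only)."""
    psi = mps[0]
    for tensor in mps[1:]:
        psi = np.einsum('...a,aib->...ib', psi, tensor)
    return psi.reshape(-1)

n = 4
psi = mps_to_vector(ghz_mps(n))
target = np.zeros(2 ** n)
target[0] = target[-1] = 1 / np.sqrt(2)
print(np.allclose(psi, target))               # True
```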