2019
DOI: 10.3390/e21121236

Probabilistic Modeling with Matrix Product States

Abstract: Inspired by the possibility that generative models based on quantum circuits can provide a useful inductive bias for sequence modeling tasks, we propose an efficient training algorithm for a subset of classically simulable quantum circuit models. The gradient-free algorithm, presented as a sequence of exactly solvable effective models, is a modification of the density matrix renormalization group procedure adapted for learning a probability distribution. The conclusion that circuit-based models offer a useful …
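The abstract describes an MPS trained as a probabilistic model (a Born machine, where p(x) ∝ |ψ(x)|²). As a rough illustration of the evaluation side of such a model, here is a minimal sketch assuming real-valued tensors and random initialization; the shapes, helper names, and normalization are illustrative choices, not the paper's implementation.

```python
# Minimal sketch (not the paper's code): the probability an MPS-based
# Born machine assigns to a bitstring, p(x) = psi(x)^2 / Z.
import numpy as np

def random_mps(n_sites, phys_dim=2, bond_dim=4, seed=0):
    """A random open-boundary MPS: a list of (D_left, d, D_right) tensors."""
    rng = np.random.default_rng(seed)
    dims = [1] + [bond_dim] * (n_sites - 1) + [1]
    return [rng.normal(size=(dims[i], phys_dim, dims[i + 1]))
            for i in range(n_sites)]

def amplitude(mps, bits):
    """psi(x): contract the chain with the physical legs fixed by `bits`."""
    env = np.ones((1, 1))
    for tensor, b in zip(mps, bits):
        env = env @ tensor[:, b, :]          # (1, Dl) @ (Dl, Dr) -> (1, Dr)
    return env.item()

def partition_function(mps):
    """Z = sum_x psi(x)^2, computed exactly with transfer matrices."""
    env = np.ones((1, 1))
    for tensor in mps:
        # Contract ket and bra copies of the tensor over the physical leg.
        env = np.einsum('ab,aic,bid->cd', env, tensor, tensor)
    return env.item()

mps = random_mps(n_sites=8)
x = [0, 1, 1, 0, 1, 0, 0, 1]
print("p(x) =", amplitude(mps, x) ** 2 / partition_function(mps))
```

The training algorithm in the paper then sweeps over the chain DMRG-style, updating local tensors to fit the data distribution; the sketch above covers only how a trained model assigns probabilities.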

Cited by 30 publications (24 citation statements)
References 17 publications
“…Typical neural network architectures do not involve sophisticated linear algebra operations. However, with the development of tensorized neural networks [101] and applications of various tensor networks to machine learning problems [10][11][12][13][14][102], the boundary between the two classes of networks is blurred. Thus, results presented in this paper would also be relevant to tensor network machine learning applications when one moves to more sophisticated contraction schemes.…”
Section: Discussion
confidence: 99%
“…Tensor networks are prominent approaches for studying classical statistical physics and quantum many-body physics problems [1][2][3]. In recent years, their applications have expanded rapidly to diverse areas, including the simulation and design of quantum circuits [4][5][6][7], quantum error correction [8,9], machine learning [10][11][12][13][14], language modeling [15,16], quantum field theory [17][18][19][20] and holographic duality [21,22].…”
Section: Introduction
confidence: 99%
“…Once the TNR has been constructed and optimized, the scaling dimensions can also be extracted easily. Here we use the transfer matrix technique [69] to compute the scaling dimensions at the critical inverse temperature β_c = ln(1 + √2)/2, shown in figure 19. We can see that with the TNR and the transfer matrix technique the computed scaling dimensions are accurate even at quite high orders.…”
Section: Scaling Dimensions
confidence: 96%
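The quoted passage extracts scaling dimensions from a transfer matrix at the Ising critical point. Below is a minimal sketch of that transfer-matrix technique by brute-force exact diagonalization (not the cited TNR construction): it builds the row-to-row transfer matrix for N spins on a ring at β_c = ln(1 + √2)/2 and reads off scaling dimensions from eigenvalue ratios via the finite-size relation x_n ≈ (N/2π) ln(λ₀/λ_n). The ring size and the symmetric splitting of the intra-row bonds are illustrative assumptions.

```python
# Minimal sketch: 2D Ising scaling dimensions from the exact row-to-row
# transfer matrix on a small ring, at beta_c = ln(1 + sqrt(2))/2.
import numpy as np
from itertools import product

N = 8                                        # ring size (illustrative)
beta_c = np.log(1 + np.sqrt(2)) / 2          # critical inverse temperature

# All spin configurations s in {-1, +1}^N of one row.
configs = np.array(list(product([-1, 1], repeat=N)))

def row_energy(s):
    """Sum of nearest-neighbor products within a row, periodic boundary."""
    return np.sum(s * np.roll(s, -1, axis=-1), axis=-1)

# Symmetrized transfer matrix:
# T[a, b] = exp(beta * s_a . s_b + beta/2 * (E_row(a) + E_row(b))).
inter = configs @ configs.T                  # inter-row couplings, all pairs
intra = row_energy(configs)                  # intra-row couplings per config
T = np.exp(beta_c * (inter + 0.5 * (intra[:, None] + intra[None, :])))

evals = np.sort(np.abs(np.linalg.eigvalsh(T)))[::-1]
x = (N / (2 * np.pi)) * np.log(evals[0] / evals[1:6])
print(x)  # lowest values approach 1/8 (spin) and 1 (energy) as N grows
```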
“…Tensor networks have been a powerful tool for studying quantum many-body systems and classical statistical-mechanical models, both theoretically [1,2] and numerically [3][4][5][6][7][8]. Recently, they have been proposed as an alternative tool for (quantum) machine learning tasks, both for supervised learning [9][10][11][12][13][14][15][16] and for unsupervised learning [17][18][19][20][21][22]. In the family of tensor networks, the multi-scale entanglement renormalization ansatz (MERA) and tensor network renormalization (TNR) are important tensor networks inspired by the idea of the renormalization group (RG).…”
Section: Introduction
confidence: 99%
“…Tensor networks have been successfully applied to several learning tasks, including dimensionality reduction [250], unsupervised learning and generative modelling using matrix product states [251][252][253], representation learning with multi-scale tensor networks [254], sequence-to-sequence learning using matrix product operators [255], language modelling [256,257], and Bayesian inference [258].…”
Section: Quantum Physics-inspired Machine Learning
confidence: 99%