2019
DOI: 10.1103/PhysRevLett.122.080602
Solving Statistical Mechanics Using Variational Autoregressive Networks

Abstract: We propose a general framework for solving statistical mechanics of systems with finite size. The approach extends the celebrated variational mean-field approaches using autoregressive neural networks, which support direct sampling and exact calculation of normalized probability of configurations. It computes variational free energy, estimates physical quantities such as entropy, magnetizations and correlations, and generates uncorrelated samples all at once. Training of the network employs the policy gradient…
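The abstract describes an autoregressive network that yields exact normalized probabilities and is trained by policy gradient to minimize the variational free energy. The following is a minimal sketch of that idea, not the paper's implementation: all sizes, parameter names, and the linear autoregressive parametrization are illustrative assumptions, using a tiny open-boundary 1D Ising chain as the target system.

```python
import numpy as np

rng = np.random.default_rng(0)
N, beta = 6, 1.0   # tiny 1D Ising chain; sizes are illustrative, not from the paper

# Autoregressive model: p(s_i = +1 | s_{<i}) = sigmoid(b_i + sum_{j<i} W[i, j] s_j)
b = np.zeros(N)
W = np.zeros((N, N))   # only the strictly lower-triangular part is used

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sample_and_logq(batch):
    """Sample configurations spin by spin; log q(s) is exact by construction."""
    s = np.zeros((batch, N))
    logq = np.zeros(batch)
    for i in range(N):
        p = sigmoid(b[i] + s[:, :i] @ W[i, :i])
        x = (rng.random(batch) < p).astype(float)   # x = 1 encodes s_i = +1
        s[:, i] = 2 * x - 1
        logq += x * np.log(p) + (1 - x) * np.log(1 - p)
    return s, logq

def energy(s):
    """Open-boundary ferromagnetic chain: E(s) = -sum_i s_i s_{i+1}."""
    return -(s[:, :-1] * s[:, 1:]).sum(axis=1)

# Policy-gradient (REINFORCE) minimization of the variational free energy
# F_q = E_q[E(s) + (1/beta) log q(s)], with a mean baseline to reduce variance.
lr, f_hist = 0.05, []
for step in range(500):
    s, logq = sample_and_logq(256)
    f = energy(s) + logq / beta        # per-sample free-energy estimator
    f_hist.append(f.mean())
    adv = f - f.mean()
    x = (s + 1) / 2
    gb, gW = np.zeros_like(b), np.zeros_like(W)
    for i in range(N):
        p = sigmoid(b[i] + s[:, :i] @ W[i, :i])
        d = adv * (x[:, i] - p)        # d log q / d z_i = x_i - p_i
        gb[i] = d.mean()
        gW[i, :i] = (d[:, None] * s[:, :i]).mean(axis=0)
    b -= lr * gb
    W -= lr * gW

def logq_of(s):
    """Exact log q for given configurations (used to verify normalization)."""
    logq = np.zeros(len(s))
    for i in range(N):
        p = sigmoid(b[i] + s[:, :i] @ W[i, :i])
        x = (s[:, i] + 1) / 2
        logq += x * np.log(p) + (1 - x) * np.log(1 - p)
    return logq

configs = np.array([[2 * int(c) - 1 for c in format(k, '06b')]
                    for k in range(2 ** N)], dtype=float)
```

Because the conditionals are products of normalized Bernoulli factors, the probabilities of all 2^N configurations sum to one exactly, with no partition-function estimate needed; the free-energy estimate decreases from the uniform-model value toward the exact free energy during training.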

Cited by 191 publications (258 citation statements)
References 37 publications
“…Recently, there has been progress in the development of flow-based generative models which can be trained to directly produce samples from a given probability distribution; early success has been demonstrated in theories of bosonic matter, spin systems, molecular systems, and for Brownian motion [24][25][26][27][28][29][30][31][32][33][34]. This progress builds on the great success of flow-based approaches for image, text, and structured object generation [35][36][37][38][39][40][41][42], as well as non-flow-based machine learning techniques applied to sampling for physics [43][44][45][46][47][48]. If flow-based algorithms can be designed and implemented at the scale of state-of-the-art calculations, they would enable efficient sampling in lattice theories that are currently hindered by CSD.…”
“…The network architecture draws upon successful autoregressive models for representing and sampling from probability distributions. Those are widely employed in the machine learning literature [15], and have been recently used for statistical mechanics applications [16], as well as density matrix reconstructions from experimental quantum systems [17]. We generalize these autoregressive models to treat complex-valued wave-functions, obtaining highly expressive architectures parametrizing an automatically normalized many-body quantum wave-function.…”
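The quoted statement describes generalizing autoregressive models to complex-valued, automatically normalized wavefunctions. A minimal sketch of that construction follows, with hypothetical toy parameters (the linear phase map and all names are illustrative assumptions, not the cited architecture): the amplitude is the square root of an exactly normalized autoregressive distribution, and any real-valued network supplies the phase.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 4   # hypothetical tiny spin system; sizes and parameters are illustrative

# Amplitude: an autoregressive Bernoulli model q(s); phase: a simple linear map.
b = rng.normal(size=N)
W = np.tril(rng.normal(size=(N, N)), -1)
phase_w = rng.normal(size=N)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def psi(s):
    """psi(s) = sqrt(q(s)) * exp(i * phi(s)); normalized because q sums to 1."""
    logq = np.zeros(len(s))
    for i in range(N):
        p = sigmoid(b[i] + s[:, :i] @ W[i, :i])
        x = (s[:, i] + 1) / 2
        logq += x * np.log(p) + (1 - x) * np.log(1 - p)
    phi = s @ phase_w   # any real-valued "phase network" would serve here
    return np.exp(0.5 * logq + 1j * phi)

configs = np.array([[2 * int(c) - 1 for c in format(k, f'0{N}b')]
                    for k in range(2 ** N)], dtype=float)
```

Since |psi(s)|^2 = q(s) and q is a product of normalized conditionals, the wavefunction is normalized by construction for any parameter values.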
“…The restricted Boltzmann machine has been successfully applied as an ansatz for ground-state search, dynamics calculation, and quantum tomography [58,59,60], as has the convolutional neural network to the two-dimensional frustrated model [61]. The deep autoregressive model was applied very efficiently and elegantly to the ground-state search of many-body quantum systems, and to classical statistical physics as well [62,63]. It was also recently shown how ML can establish and classify with high accuracy the chaotic or regular behavior of quantum billiard models and XXZ spin chains [64].…”
Section: Introduction