2020
DOI: 10.1103/physreve.101.023304

Asymptotically unbiased estimation of physical observables with neural samplers

Cited by 85 publications (116 citation statements)
References 18 publications
“…Within these developments machine learning has critically influenced the domain of statistical mechanics, particularly in the study of phase transitions [3,4]. A wide range of machine learning techniques, including neural networks [5][6][7][8][9][10][11][12][13][14][15][16][17], diffusion maps [18], support vector machines [19][20][21][22] and principal component analysis [23][24][25][26][27] have been implemented to study equilibrium and non-equilibrium systems. Transferable features have also been explored in phase transitions, including modified models through a change of lattice topology [3] or form of interaction [28], in Potts models with a varying odd number of states [29], in the Hubbard model [6], in fermions [5], in the neural network-quantum states ansatz [30,31] and in adversarial domain adaptation [32].…”
Section: Introduction
confidence: 99%
“…The code for the simulation is written in Python using the NumPy library [38][39][40][41]. We note that using a generative neural network, instead of the Monte Carlo method, for sampling has also been proposed recently for 1D quantum spin models [42,43].…”
Section: Methodology, A. Monte Carlo Sampling
confidence: 99%
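The Monte Carlo sampling the cited methodology section refers to can be sketched as a standard Metropolis update in NumPy. This is a minimal illustration for a 1D Ising chain with periodic boundaries, not the cited implementation; system size, temperature, and sweep count are illustrative.

```python
import numpy as np

def metropolis_ising_1d(n_spins=32, beta=1.0, n_sweeps=2000, seed=0):
    """Metropolis sampling of a 1D Ising chain, H = -sum_i s_i s_{i+1}."""
    rng = np.random.default_rng(seed)
    spins = rng.choice([-1, 1], size=n_spins)
    energies = []
    for _ in range(n_sweeps):
        for i in range(n_spins):
            # Energy change from flipping spin i (J = 1, zero field).
            dE = 2.0 * spins[i] * (spins[(i - 1) % n_spins]
                                   + spins[(i + 1) % n_spins])
            # Accept the flip with the Metropolis probability min(1, e^{-beta dE}).
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                spins[i] = -spins[i]
        # Record the energy after each full sweep.
        energies.append(-np.sum(spins * np.roll(spins, 1)))
    return spins, np.array(energies)

spins, energies = metropolis_ising_1d()
```

Averaging `energies` after discarding an initial burn-in gives the usual Monte Carlo estimate of the thermal energy; the neural-sampler alternative mentioned in the quote replaces this Markov chain with direct draws from a trained generative model.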
“…Likewise, Ref. [193] explores a general framework for the estimation of observables with generative neural samplers focusing on autoregressive networks and normalizing flows that provide an exact sampling probability. These examples anticipate that neural autoregressive models have potential applications in important combinatorial optimization and constraint satisfaction problems, where finding the optimal configurations corresponds to finding ground states of glassy problems, and counting the number of solutions is equivalent to estimating the zero-temperature entropy of the system.…”
Section: Machine Learning Acceleration of Monte Carlo Simulations
confidence: 99%
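The key property the quote highlights, a sampler that assigns an exact sampling probability q(x) to each configuration, is what makes asymptotically unbiased estimation possible via self-normalized importance reweighting: weight each sample by the unnormalized Boltzmann weight over q(x). A toy sketch of this reweighting idea (not the cited implementation; the two-state system and all parameters are made up for illustration):

```python
import numpy as np

def reweighted_estimate(samples, log_q, neg_beta_E, observable):
    """Self-normalized importance-sampling estimate of <O> under the
    Boltzmann distribution, given exact sampler log-probabilities log_q
    and -beta*E(x) for each sample."""
    log_w = neg_beta_E - log_q     # log of unnormalized importance weight
    log_w -= log_w.max()           # stabilize before exponentiating
    w = np.exp(log_w)
    w /= w.sum()                   # self-normalization cancels both Z's
    return np.sum(w * observable(samples))

# Toy check: two-state system with energies (0, 1) at beta = 1, sampled
# from a uniform "sampler" q, estimating the mean energy.
rng = np.random.default_rng(0)
E = np.array([0.0, 1.0])
x = rng.integers(0, 2, size=20000)
est = reweighted_estimate(x, np.full(x.shape, np.log(0.5)), -E[x],
                          lambda s: E[s])
```

In the limit of many samples, the estimate converges to the exact thermal average e^{-1}/(1 + e^{-1}) regardless of the mismatch between q and the Boltzmann distribution, which is the sense in which such estimators are asymptotically unbiased.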