2020
DOI: 10.1103/physrevd.102.054501

Topological defects and confinement with machine learning: The case of monopoles in compact electrodynamics

Abstract: We investigate the advantages of machine learning techniques to recognize the dynamics of topological objects in quantum field theories. We consider the compact U(1) gauge theory in three spacetime dimensions as the simplest example of a theory that exhibits confinement and mass gap phenomena generated by monopoles. We train a neural network with a generated set of monopole configurations to distinguish between confinement and deconfinement phases, from which it is possible to determine the deconfinement trans…
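As an illustration of the phase-classification setup the abstract describes, here is a minimal sketch on synthetic stand-in data. All numbers, the one-feature "monopole density," and the logistic-regression classifier are invented for illustration; the paper's actual monopole configurations and network architecture differ.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for monopole configurations: assume the monopole density
# is high in the confined phase and suppressed in the deconfined phase.
# These distributions are illustrative, not taken from the paper.
n = 200
confined = rng.normal(loc=0.8, scale=0.1, size=(n, 1))
deconfined = rng.normal(loc=0.2, scale=0.1, size=(n, 1))
X = np.vstack([confined, deconfined])
y = np.concatenate([np.ones(n), np.zeros(n)])  # 1 = confined, 0 = deconfined

# Minimal logistic-regression classifier trained by gradient descent,
# standing in for the paper's neural network.
w, b = 0.0, 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X[:, 0] * w + b)))
    w -= 0.5 * np.mean((p - y) * X[:, 0])
    b -= 0.5 * np.mean(p - y)

pred = (1.0 / (1.0 + np.exp(-(X[:, 0] * w + b))) > 0.5).astype(float)
accuracy = np.mean(pred == y)
```

With well-separated densities the classifier labels the two phases almost perfectly; in the paper, the interesting regime is near the transition, where the classifier's output interpolates between the phases.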

Cited by 13 publications (12 citation statements)
References 32 publications (58 reference statements)
“…Within these developments machine learning has critically influenced the domain of statistical mechanics, particularly in the study of phase transitions [3,4]. A wide range of machine learning techniques, including neural networks [5][6][7][8][9][10][11][12][13][14][15][16][17], diffusion maps [18], support vector machines [19][20][21][22] and principal component analysis [23][24][25][26][27] have been implemented to study equilibrium and non-equilibrium systems. Transferable features have also been explored in phase transitions, including modified models through a change of lattice topology [3] or form of interaction [28], in Potts models with a varying odd number of states [29], in the Hubbard model [6], in fermions [5], in the neural network-quantum states ansatz [30,31] and in adversarial domain adaptation [32].…”
Section: Introduction (mentioning, confidence: 99%)
“…The confusion of the machine-learning algorithm may be quantified via a specific ML variable and may therefore serve as an ML-based order parameter used to determine the location of a phase transition [17]. This criterion, applied to Abelian monopoles, gives a good prediction of a thermodynamically smooth deconfinement phase transition of the Berezinskii-Kosterlitz-Thouless type in a low-dimensional model that exhibits the confinement phenomenon [13].…”
Section: Machine Learning of Lattice Fields (mentioning, confidence: 99%)
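The learning-by-confusion criterion mentioned in the statement above can be illustrated with a toy model: relabel configurations according to a guessed critical point, train a classifier, and take the guess that maximizes accuracy as the transition estimate. Everything below is invented for the sketch (a scalar feature instead of monopole configurations, a brute-force threshold classifier instead of a neural network, and a made-up critical coupling of 0.5).

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy ensemble: one scalar "feature" per configuration, whose distribution
# shifts at an assumed true critical coupling beta_c = 0.5.
betas = np.linspace(0.0, 1.0, 21)
samples = np.stack(
    [rng.normal(0.0 if b < 0.5 else 1.0, 0.15, size=50) for b in betas]
)

def accuracy_for_guess(t):
    """Relabel configurations by a guessed critical point t, fit the best
    single-threshold classifier, and return its accuracy (the confusion signal)."""
    y = np.repeat((betas >= t).astype(float), samples.shape[1])
    X = samples.ravel()
    best = 0.0
    for th in np.linspace(X.min(), X.max(), 200):
        pred = (X >= th).astype(float)
        best = max(best, np.mean(pred == y), np.mean((1.0 - pred) == y))
    return best

guesses = np.linspace(0.05, 0.95, 19)
accs = np.array([accuracy_for_guess(t) for t in guesses])
# Accuracy is trivially high at the extreme guesses (one label dominates),
# dips in between, and peaks again at the true transition: the "W-shape".
beta_c_estimate = guesses[np.argmax(accs)]
```

Only the location of the global peak matters; the ML-based order parameter in the cited works is this accuracy curve, evaluated over candidate transition points.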
“…[7]). Field configurations of the quantum field theory, viewed as statistical ensembles in the thermodynamic limit, are well-suited for the application of machine-learning techniques, as it was demonstrated in a number of recent works [7][8][9][10][11][12][13].…”
Section: Introduction (mentioning, confidence: 99%)
“…Recently, applications of deep learning [7], a class of machine learning algorithms which are able to hierarchically extract abstract features in data, have emerged in the physical sciences [8], including in lattice field theories [9][10][11][12][13][14][15][16] and in the study of phase transitions [17][18][19][20][21][22][23]. Insights on machine learning algorithms have been obtained from the perspective of statistical physics [24][25][26][27][28][29][30][31][32], particularly within the theory of spin glasses [33], or in relation to Gaussian processes [34][35][36][37][38].…”
Section: Introduction (mentioning, confidence: 99%)