2022
DOI: 10.1016/j.cartre.2022.100231

A perspective on machine learning and data science for strongly correlated electron problems


Cited by 8 publications (4 citation statements)
References 185 publications
“…This design allows users to implement straightforward parallelization at the script level, adapt the package to their existing workflows, and more readily interface SmoQyDQMC.jl with the rich ecosystem of scientific computing packages being actively developed in the Julia programming language. For example, it can be readily coupled to existing machine learning and artificial intelligence packages to enable new research in this direction [94].…”
Section: Background and Motivation (mentioning)
confidence: 99%
“…The focus is on 2D magnetic systems that often contain transition metals, which belong to strongly correlated electron systems. [89] To ensure accuracy in the calculations, it is important to take into account the role of the Hubbard parameter U, which represents the Coulomb repulsion arising from strong electron correlations. [90] Among them, the U value of the transition metal can be determined by comparing experimental values of bandgaps, structural parameters, and magnetic moments.…”
Section: Framework of HTP Screening and Adaptive ML (mentioning)
confidence: 99%
“…These patches are subsequently linearly transformed, i.e., d-dimensional representations of the input patches, called tokens, are computed. Importantly, as transformers do not sequentially process the input, the tokens are further positionally encoded, i.e., the position of the patch within the original image is stored. Thereafter follows the self-attention encoder, where all-to-all inter-dependencies between tokens are computed.…”
Section: Transformer Encoder (mentioning)
confidence: 99%
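The pipeline described in the excerpt above (patch extraction, linear embedding into tokens, positional encoding, then all-to-all self-attention) can be illustrated with a minimal NumPy sketch. All sizes here (an 8×8 toy input, 4×4 patches, token dimension d = 16, random weight matrices) are illustrative assumptions, not parameters from the cited work:

```python
import numpy as np

rng = np.random.default_rng(0)

image = rng.normal(size=(8, 8))   # toy 2D input (stand-in for a physical configuration)
patch = 4                         # patch side length -> 2x2 grid of patches
d = 16                            # token dimension

# 1. Split the image into non-overlapping patches and flatten each one.
patches = (image.reshape(2, patch, 2, patch)
                .transpose(0, 2, 1, 3)
                .reshape(4, patch * patch))        # (num_tokens, patch_pixels)

# 2. Linearly transform each patch into a d-dimensional token.
W_embed = rng.normal(size=(patch * patch, d))
tokens = patches @ W_embed                         # (4, d)

# 3. Add a positional encoding: self-attention alone is permutation
#    invariant, so the patch's position in the image must be injected.
positions = np.arange(4)[:, None]
freqs = 1.0 / (10000.0 ** (np.arange(0, d, 2) / d))
pos_enc = np.zeros((4, d))
pos_enc[:, 0::2] = np.sin(positions * freqs)
pos_enc[:, 1::2] = np.cos(positions * freqs)
tokens = tokens + pos_enc

# 4. Single-head self-attention: every token attends to every other token,
#    computing the all-to-all inter-dependencies the excerpt refers to.
W_q, W_k, W_v = (rng.normal(size=(d, d)) for _ in range(3))
Q, K, V = tokens @ W_q, tokens @ W_k, tokens @ W_v
scores = Q @ K.T / np.sqrt(d)
attn = np.exp(scores - scores.max(axis=-1, keepdims=True))
attn = attn / attn.sum(axis=-1, keepdims=True)     # row-wise softmax
out = attn @ V                                     # (4, d) attended tokens

print(out.shape)
```

A full transformer encoder would add multiple heads, residual connections, layer normalization, and a feed-forward block; this sketch keeps only the token-embedding and attention steps that the excerpt describes.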
“…Next to revolutionizing applications in image and sequence processing, in recent years neural networks have gained tremendous interest also in the field of quantum many-body physics [4][5][6][7]. In strongly correlated systems, complex phases of matter can emerge in seemingly simple models -which, in many settings, still lack microscopic understanding [8,9].…”
Section: Introduction (mentioning)
confidence: 99%