2021
DOI: 10.1038/s41467-020-20471-y
Pure non-local machine-learned density functional theory for electron correlation

Abstract: Density-functional theory (DFT) is a rigorous and (in principle) exact framework for the description of the ground state properties of atoms, molecules and solids based on their electron density. While computationally efficient density-functional approximations (DFAs) have become essential tools in computational chemistry, their (semi-)local treatment of electron correlation has a number of well-known pathologies, e.g. related to electron self-interaction. Here, we present a type of machine-learning (ML) based…

Cited by 53 publications (58 citation statements)
References 80 publications
“…this work, singlet-triplet energy splittings from MRCISD-F12+Q); this training scheme is outlined in Figure 1. This centering of the loss function solely on relative energies stands in contrast to previous work in NeuralXC,19 DeepKS,21 OrbNet,20 and KDFA,22 but it has three advantages: (i) it allows benchmark results to be obtained from a variety of different sources (including experiment, which almost always yields relative energies); (ii) … For optimization of parameters and hyperparameters, the 360 carbenes were split into a training set of 287 carbenes, a validation set of 37 carbenes, and a test set of 36 carbenes. … the validation set.…”
contrasting
confidence: 63%
“…Inspired by both the success of these methods and recent work that has used neural networks to develop density functionals for KS-DFT,19–22 we introduce a broader class of methods named "multiconfiguration density-driven functional methods" (MC-DDFMs) which aim to correct the classical or total energy E_ref of a multiconfigurational wave function method through the use of a machine-learned functional E_ML:…”
Section: Introduction
mentioning
confidence: 99%
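The MC-DDFM idea quoted above — a reference energy E_ref corrected additively by a machine-learned functional E_ML of the density — can be sketched in a few lines. This is a minimal illustration, not the paper's implementation; the function names, the two-feature descriptor, and the linear "ML functional" are all hypothetical.

```python
# Hypothetical sketch of the MC-DDFM correction: the total energy is the
# multiconfigurational reference energy plus a machine-learned functional
# evaluated on density-derived features. All names and values are illustrative.

def mc_ddfm_energy(e_ref, density_features, e_ml):
    """Corrected energy: reference energy plus ML functional of the density."""
    return e_ref + e_ml(density_features)

# Toy stand-in for E_ML: a linear model on two density-derived features.
weights = [0.5, -0.2]
e_ml = lambda features: sum(w * x for w, x in zip(weights, features))

# Example: reference energy -40.0 with features [1.0, 2.0] gives a
# correction of 0.5 - 0.4 = 0.1, i.e. roughly -39.9.
print(mc_ddfm_energy(-40.0, [1.0, 2.0], e_ml))
```

In a real MC-DDFM, E_ML would be a trained neural-network functional and the features would come from the electron density, but the additive structure E_ref + E_ML is the point being illustrated.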
“…Given a difference in energy between these states from an inexpensive reference method, ∆E_ref, we train functionals to minimize the mean squared deviation between the corrected energy difference, ∆E_ref + ∆E_ML, and a target energy difference, ∆E_target (in this work, singlet-triplet energy splittings from MRCISD-F12+Q); this training scheme is outlined in Figure 1. Although this centering of the loss function solely on relative energies stands in contrast to previous work in NeuralXC,21 DeepKS,22 OrbNet,23 and KDFA,29 it has three advantages: (i) it allows benchmark results to be obtained from a variety of different sources (including experiment, which almost always yields relative energies); (ii) relative energies are the quantities of most interest to chemists, since bond energies, energies of reaction, and barrier heights are all relative energies; and (iii) theoretical data used for training is almost always more accurate for relative energies than for absolute energies. For optimization of parameters and hyperparameters, the 360 carbenes were split into a training set of 287 carbenes, a validation set of 37 carbenes, and a test set of 36 carbenes.…”
contrasting
confidence: 57%
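The training scheme quoted above — fitting a correction so that ∆E_ref plus the learned term matches ∆E_target in a mean-squared sense, with a 287/37/36 split of 360 molecules — can be sketched as follows. The data here are synthetic and the "ML correction" is deliberately reduced to a single fitted constant; only the loss structure and the split mirror the quoted passage.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: per-molecule reference and target singlet-triplet
# splittings (arbitrary units). Values are synthetic, for illustration only.
n = 360
dE_ref = rng.normal(10.0, 5.0, n)             # cheap reference splittings
dE_target = dE_ref + rng.normal(2.0, 1.0, n)  # accurate benchmark splittings

# Split the 360 molecules into train/validation/test sets of 287/37/36,
# mirroring the split described in the quoted passage.
idx = rng.permutation(n)
train, val, test = idx[:287], idx[287:324], idx[324:]

# A trivial one-parameter "ML correction": a constant shift c, which is the
# closed-form minimizer of the mean squared deviation between the corrected
# splitting (dE_ref + c) and the target splitting on the training set.
c = np.mean(dE_target[train] - dE_ref[train])

def loss(split):
    """Mean squared deviation of corrected vs. target relative energies."""
    corrected = dE_ref[split] + c
    return np.mean((corrected - dE_target[split]) ** 2)

print(f"train MSE: {loss(train):.3f}, val MSE: {loss(val):.3f}")
```

The key design point from the quote survives even in this toy: the loss touches only *relative* energies, so any method whose benchmark data come as energy differences (including experiment) can supply training targets.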
“…205,208,237 Machine learning can be of use to enhance and accelerate the quantum chemical method itself, rather than providing a complete substitute such as an MLIP.168,238 For example, a Δ-learning scheme can be used which learns only the difference between a cheap low-level method (semi-empirical, classical, etc.) and an accurate high-level method (DFT, post-HF, etc.).…”
Section: Machine Learning For Molecular Dynamics Simulations
mentioning
confidence: 99%
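The Δ-learning scheme described above — learning only the gap between a cheap and an accurate method, then adding the learned correction back onto the cheap result — can be sketched with a noiseless toy regression. The descriptor, the two "methods", and the quadratic model are all illustrative assumptions, not any specific published workflow.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical Δ-learning setup: learn only the difference between a cheap
# low-level method and an accurate high-level method, then add the learned
# correction back onto the cheap energy at prediction time.
x = rng.uniform(-1.0, 1.0, 200)        # toy one-dimensional molecular descriptor
e_low = 2.0 * x                        # cheap method (e.g. semi-empirical)
e_high = e_low + 0.3 * x**2 + 0.1      # accurate method (e.g. post-HF)

# Fit the *difference* e_high - e_low with a quadratic least-squares model,
# rather than fitting e_high directly.
features = np.column_stack([np.ones_like(x), x, x**2])
coef, *_ = np.linalg.lstsq(features, e_high - e_low, rcond=None)

# Prediction: cheap energy plus learned correction.
e_pred = e_low + features @ coef
print(np.max(np.abs(e_pred - e_high)))  # near zero for this noiseless toy
```

Because the model only has to capture the (usually smooth and small) difference between the two methods, Δ-learning typically needs far less training data than learning the high-level energy from scratch.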