2018
DOI: 10.1063/1.5025668
Can exact conditions improve machine-learned density functionals?

Abstract: Historical methods of functional development in density functional theory have often been guided by analytic conditions that constrain the exact functional one is trying to approximate. Recently, machine-learned functionals have been created by interpolating the results from a small number of exactly solved systems to unsolved systems that are similar in nature. For a simple one-dimensional system, using an exact condition, we find improvements in the learning curves of a machine learning approximation to the …
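As a rough illustration of the workflow the abstract describes, the sketch below fits kernel ridge regression to a handful of "exactly solved" one-dimensional densities and tracks the test error as the training set grows, i.e., a crude learning curve. The Gaussian test densities, the single-particle kinetic-energy labels, and all parameter values are illustrative assumptions, not the paper's actual data or model.

```python
# Minimal sketch (assumed setup, not the paper's data): learn a map from 1D
# densities n(x) on a grid to kinetic energies T[n] with kernel ridge regression
# and watch how the test error falls as more training densities are added.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
x = np.linspace(-5.0, 5.0, 200)              # real-space grid (atomic units assumed)

def gaussian_density(mu, sigma):
    """Normalized one-particle density; a stand-in for exactly solved systems."""
    n = np.exp(-(x - mu) ** 2 / (2.0 * sigma ** 2))
    return n / np.trapz(n, x)

# For psi = sqrt(n), the exact kinetic energy of this Gaussian density is
# 1/(8 sigma^2); it serves as the "exactly solved" training label here.
mus = rng.uniform(-1.0, 1.0, size=300)
sigmas = rng.uniform(0.5, 2.0, size=300)
X = np.array([gaussian_density(m, s) for m, s in zip(mus, sigmas)])
y = 1.0 / (8.0 * sigmas ** 2)

X_test, y_test = X[200:], y[200:]
for n_train in (5, 10, 20, 40, 80):          # crude learning curve
    model = KernelRidge(kernel="rbf", alpha=1e-8, gamma=1.0)
    model.fit(X[:n_train], y[:n_train])
    mae = np.mean(np.abs(model.predict(X_test) - y_test))
    print(f"N={n_train:3d}  test MAE = {mae:.2e} Hartree")
```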

Cited by 55 publications (55 citation statements) · References 37 publications
“…The original ML model for the KEDF has been shown to successfully describe bond breaking [26] and was extended to include basis set independence [27] as well as scale-invariance conditions [28]. The same ML model has also been employed for direct fits of F[n], the universal part of the total energy density functional [29]. A very interesting ML model was investigated by Yao and Parkhill, who used a 1D convolutional neural network to fit the kinetic energy as a function of the density projected onto bond directions.…”
Section: Introduction (citation type: mentioning, confidence: 99%)
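The 1D convolutional network mentioned in this excerpt can be illustrated with a minimal sketch. The version below maps a density sampled on a plain 1D grid to a scalar kinetic energy; the architecture, layer sizes, and grid are placeholders of my own choosing, not the bond-projected network of the cited work.

```python
# Assumed minimal sketch of a 1D CNN mapping a discretized density to a scalar
# kinetic energy, in the spirit of the model mentioned above; sizes are placeholders.
import torch
import torch.nn as nn

class DensityToKE(nn.Module):
    def __init__(self, channels=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, channels, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(channels, channels, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),          # pool over the grid dimension
            nn.Flatten(),
            nn.Linear(channels, 1),           # scalar kinetic-energy prediction
        )

    def forward(self, density):
        # density: (batch, grid_points) -> add a channel axis for Conv1d
        return self.net(density.unsqueeze(1)).squeeze(-1)

model = DensityToKE()
fake_densities = torch.rand(4, 200)           # placeholder densities on a 200-point grid
print(model(fake_densities).shape)            # -> torch.Size([4])
```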
“…In this paper, we describe an approach for generating an ML framework that satisfies the criteria outlined above. The ML model employed in this work is kernel ridge regression (KRR), the basic principles of which in the construction of density functionals have been developed over several years [63][64][65][66][67][68][69]. In order to advance our ML framework [53] to the prediction of coupled-cluster (CC) energies, as opposed to DFT energies, one need only recognize that the basic ML construction procedure is independent of the source of inputs.…”
Citation type: mentioning (confidence: 99%)
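The excerpt's point that "the basic ML construction procedure is independent of the source of inputs" is visible directly in the KRR closed form: the fitted weights depend only on the descriptors and the target energies, so swapping DFT labels for coupled-cluster labels changes the label vector and nothing else. The bare-bones implementation below uses a Gaussian kernel and synthetic data as placeholder assumptions, not anything from the cited work.

```python
# Bare-bones kernel ridge regression (assumed Gaussian kernel, synthetic data):
# alpha = (K + lambda*I)^{-1} y depends only on the feature vectors and the
# target energies, whatever their source (DFT, CC, ...).
import numpy as np

def rbf_kernel(A, B, length_scale=1.0):
    """Gaussian kernel matrix between rows of A and rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * length_scale ** 2))

def krr_fit(X, y, lam=1e-6, length_scale=1.0):
    """Solve the regularized linear system for the kernel weights."""
    K = rbf_kernel(X, X, length_scale)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def krr_predict(X_train, alpha, X_new, length_scale=1.0):
    """Predict targets for new inputs from the fitted weights."""
    return rbf_kernel(X_new, X_train, length_scale) @ alpha

# Placeholder descriptors and energies; y could equally hold DFT or CC values.
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 8))
y = np.sin(X).sum(axis=1)
alpha = krr_fit(X, y)
print(krr_predict(X, alpha, X[:3]), y[:3])
```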
“…Non-linearity of the NN fit was mild (most of the KED variance could be captured by a linear fit); as a result, the exact conditions were, if one can say so, 'nearly satisfied'. Burke and co-workers showed on one-dimensional synthetic problems that the use of scaled variables may or may not improve machine-learned KEFs [120]; they used the KRR method. It appears that using non-scaled variables is a useful line of exploration which should not be ignored, as exact uniform density scaling does not happen in Nature.…”
Section: Kinetic Energy Functionals (citation type: mentioning, confidence: 99%)
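The "scaled variables" in this excerpt refer to the exact uniform scaling relation for the noninteracting kinetic energy, which in 1D reads T[n_gamma] = gamma^2 T[n] with n_gamma(x) = gamma * n(gamma x). The snippet below simply verifies that relation numerically for a single-particle (von Weizsaecker) density and notes, in a comment, one naive way scaled copies could be fed to a learned model; the test density and the augmentation idea are illustrative assumptions, not the procedure examined in the paper.

```python
# Numerical check (illustrative, not from the paper) of 1D uniform density
# scaling: for n_gamma(x) = gamma * n(gamma * x), the kinetic energy obeys
# T[n_gamma] = gamma**2 * T[n].
import numpy as np

x = np.linspace(-10.0, 10.0, 4001)

def vw_kinetic_energy(n, grid):
    """Single-particle (von Weizsaecker) kinetic energy T = 1/8 * int (n')^2 / n dx."""
    dn = np.gradient(n, grid)
    return 0.125 * np.trapz(dn ** 2 / np.maximum(n, 1e-12), grid)

def scale_density(n_func, gamma):
    """Return the uniformly scaled density n_gamma(x) = gamma * n(gamma * x) on the grid."""
    return gamma * n_func(gamma * x)

n_func = lambda t: np.exp(-t ** 2) / np.sqrt(np.pi)   # normalized test density
for gamma in (0.5, 1.0, 2.0):
    t_scaled = vw_kinetic_energy(scale_density(n_func, gamma), x)
    t_ref = gamma ** 2 * vw_kinetic_energy(n_func(x), x)
    print(f"gamma={gamma}: T[n_gamma]={t_scaled:.6f}  gamma^2*T[n]={t_ref:.6f}")

# One naive way to expose a learned functional to this exact condition would be
# to augment each training density with scaled copies and exactly scaled labels:
#   X_aug = [scale_density(n, g) for g in gammas], y_aug = [g**2 * T for g in gammas]
```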