2020
DOI: 10.1021/acs.jctc.0c00217
Incorporating Electronic Information into Machine Learning Potential Energy Surfaces via Approaching the Ground-State Electronic Energy as a Function of Atom-Based Electronic Populations

Abstract: Machine learning (ML) approximations to density functional theory (DFT) potential energy surfaces (PESs) are showing great promise for reducing the computational cost of accurate molecular simulations, but at present, they are not applicable to varying electronic states, and in particular, they are not well suited for molecular systems in which the local electronic structure is sensitive to the medium to long-range electronic environment. With this issue as the focal point, we present a new machine learning ap…

Cited by 78 publications (83 citation statements)
References 127 publications (166 reference statements)
“…This method has enabled the inclusion of long-range charge transfer in an ML framework for the first time, but due to the employed energy expression this method is primarily applicable to ionic systems [35][36][37], and the overall accuracy is still lower than that of other state-of-the-art ML potentials. Recently, another promising method has been proposed by Xie, Persson and Small 38 aiming for a correct description of systems with different charge states. In this method, atomic neural networks are used that depend not only on the local structure but also on atomic populations, which are determined in a self-consistent process.…”
mentioning
confidence: 99%
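The population-dependent, self-consistently determined energy described in this statement can be illustrated with a small sketch. This is a conceptual toy only, assuming a made-up random two-layer "atomic network", a 4-atom system, and 4-dimensional descriptors; it is not the BpopNN architecture, descriptors, or training procedure, but it shows the idea of treating the ground-state energy as a function of atom-based populations constrained to the total electron count.

```python
# Conceptual sketch (not the authors' code): total energy as a sum of atomic
# networks that see both a local structural descriptor and an atomic
# population; populations are found by constrained energy minimization.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy stand-in for a trained atomic network: fixed random two-layer MLP
# mapping (4 descriptor components + 1 population) -> atomic energy.
W1 = rng.normal(size=(8, 5))
b1 = rng.normal(size=8)
w2 = rng.normal(size=8)

def atomic_energy(descriptor, population):
    x = np.concatenate([descriptor, [population]])
    return w2 @ np.tanh(W1 @ x + b1)

def total_energy(populations, descriptors):
    return sum(atomic_energy(d, p) for d, p in zip(descriptors, populations))

# Toy system: 4 atoms with random descriptors and 10 electrons in total.
descriptors = rng.normal(size=(4, 4))
n_electrons = 10.0
p0 = np.full(4, n_electrons / 4)  # start from an even distribution

# "Self-consistent" populations: minimize the energy over the populations
# subject to the constraint that they sum to the total electron count.
res = minimize(
    total_energy, p0, args=(descriptors,), method="SLSQP",
    constraints=[{"type": "eq", "fun": lambda p: p.sum() - n_electrons}],
)
print("optimal populations:", res.x)
print("minimized (ground-state-like) energy:", res.fun)
```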
“…To our knowledge, this is a primary example where the ML model provides a consistent and qualitatively correct physical behavior between molecular geometry, energy, integral molecular charge, and partial atomic charges. Upon submitting this manuscript we learned about work by Xie, 61 where an ML model was built to predict the energy as a function of electron populations in prototypical LiH clusters. Other schemes like BP, 12 TensorMol, 15 HIP-NN, 62,63 and PhysNet 18 typically employ an auxiliary neural network that predicts atomic charges from a local geometrical descriptor.…”
Section: Results
mentioning
confidence: 99%
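The auxiliary-charge pattern mentioned at the end of this statement (a network predicting atomic charges from local descriptors, which then feed an explicit electrostatic term) can be sketched as follows. The linear "charge network", the charge-neutrality shift, and the undamped Coulomb sum are placeholder assumptions for illustration; they do not reproduce the actual BP, TensorMol, HIP-NN, or PhysNet models.

```python
# Illustrative sketch of the auxiliary-charge scheme: predict atomic charges
# from local descriptors, then add an explicit pairwise electrostatic energy.
import numpy as np

rng = np.random.default_rng(1)

Wq = 0.1 * rng.normal(size=4)  # toy linear map: descriptor -> raw atomic charge

def predict_charges(descriptors, total_charge=0.0):
    raw = descriptors @ Wq
    # Uniform shift so the charges respect the requested total molecular charge.
    return raw + (total_charge - raw.sum()) / len(raw)

def electrostatic_energy(charges, positions):
    e = 0.0
    for i in range(len(charges)):
        for j in range(i + 1, len(charges)):
            r = np.linalg.norm(positions[i] - positions[j])
            e += charges[i] * charges[j] / r  # atomic units, no short-range damping
    return e

descriptors = rng.normal(size=(4, 4))
positions = 3.0 * rng.normal(size=(4, 3))
q = predict_charges(descriptors, total_charge=0.0)
print("predicted charges:", q, "sum:", q.sum())
print("electrostatic contribution:", electrostatic_energy(q, positions))
```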
“…Due to the charge equilibration step, the naive implementation scales cubically with system size, but close-to-linear scaling can be achieved using iterative schemes. Another related method is the Becke population neural network (BpopNN) [67]. This method relies on atomic neural networks using modified SOAP descriptors that also include the atomic populations, i.e.…”
Section: Beyond Locality - Long-Ranged MLPs
mentioning
confidence: 99%
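The scaling remark in this statement (a direct charge-equilibration solve is cubic in the number of atoms, while iterative schemes avoid the dense factorization) can be made concrete with a toy linear system. The Coulomb-like matrix, hardness values, and system size below are generic assumptions, not a specific charge-equilibration parameterization.

```python
# Sketch of the scaling argument: a charge-equilibration-style linear system
# solved by dense factorization (O(n^3)) versus an iterative Krylov solver
# that only needs matrix-vector products.
import numpy as np
from scipy.sparse.linalg import cg

rng = np.random.default_rng(2)
n = 200                                    # number of atoms
positions = 10.0 * rng.normal(size=(n, 3))
chi = rng.normal(size=n)                   # electronegativity-like terms

# Coulomb-like off-diagonal couplings 1/r_ij.
dist = np.linalg.norm(positions[:, None] - positions[None, :], axis=-1)
A = 1.0 / (dist + np.eye(n))               # eye() avoids division by zero on the diagonal
np.fill_diagonal(A, 0.0)
# Hardness chosen large enough to keep the matrix symmetric positive definite
# (strict diagonal dominance), so both solvers are well behaved.
np.fill_diagonal(A, np.abs(A).sum(axis=1) + 1.0)

b = -chi
q_direct = np.linalg.solve(A, b)           # dense solve: cubic in n
q_iter, info = cg(A, b)                    # conjugate gradient: matrix-vector products only
print("CG converged:", info == 0, "| max deviation:", np.abs(q_direct - q_iter).max())
```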