2020
DOI: 10.1109/led.2020.2972066
Transfer Learning of Potential Energy Surfaces for Efficient Atomistic Modeling of Doping and Alloy

Cited by 6 publications (4 citation statements)
References 32 publications
“…One is transfer learning, which has been developed extensively in the context of artificial neural networks, 1 and much of the work in that field has been brought into chemistry, especially in the development of PESs. [2][3][4] For example, Meuwly and co-workers applied transfer learning using thousands of local CCSD(T) energies to improve their MP2-based neural network PESs for malonaldehyde, acetoacetaldehyde and acetylacetone. 3 The basic idea of transfer learning is that a fit obtained from one source of data (perhaps a large one) can be fine-tuned for a related problem by using limited data.…”
Section: Introduction (mentioning)
confidence: 99%
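The fine-tuning workflow described in this excerpt can be sketched in a few lines. The snippet below is a minimal, illustrative sketch only, assuming a small PyTorch feed-forward model over precomputed structural descriptors; the class name DescriptorPES, the arrays X_dft/E_dft and X_ccsdt/E_ccsdt, and all layer sizes, epochs, and learning rates are placeholders, not details taken from the cited works.

```python
# Minimal transfer-learning sketch for a neural-network PES (illustrative only).
# Assumptions: descriptors X_* are precomputed NumPy arrays, energies E_* are
# 1-D arrays; the architecture and hyperparameters are placeholders.
import torch
import torch.nn as nn

class DescriptorPES(nn.Module):
    """Simple feed-forward map from a fixed-size descriptor to a total energy."""
    def __init__(self, n_desc: int, n_hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_desc, n_hidden), nn.Tanh(),
            nn.Linear(n_hidden, n_hidden), nn.Tanh(),
            nn.Linear(n_hidden, 1),
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)

def fit(model, X, E, lr, epochs):
    """Least-squares energy fit with Adam; only trainable parameters update."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(X), E)
        loss.backward()
        opt.step()
    return model

# 1) Pre-train on a large, cheap data set (e.g. DFT/MP2-level energies).
model = DescriptorPES(n_desc=X_dft.shape[1])
model = fit(model, torch.as_tensor(X_dft, dtype=torch.float32),
            torch.as_tensor(E_dft, dtype=torch.float32), lr=1e-3, epochs=2000)

# 2) Fine-tune on a small, expensive data set (e.g. a few CCSD(T) points),
#    freezing the first layer so only the later layers adapt.
for p in model.net[0].parameters():
    p.requires_grad = False
model = fit(model, torch.as_tensor(X_ccsdt, dtype=torch.float32),
            torch.as_tensor(E_ccsdt, dtype=torch.float32), lr=1e-4, epochs=500)
```

Freezing the early layers in the second stage is one common way to retain the representation learned from the large low-level data set while letting the few high-level points adjust the output mapping.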
“…This has been demonstrated in a recent study, in which the neurons act as the tight-binding (TB) matrix elements in the Hamiltonian parameterization of the TB model for energy band calculation [124]. From calculation methodology to case studies, several studies have been reported in which machine-learning-augmented DFT works efficiently with high accuracy in atomistic modeling for devices, including the prediction of atomic force in phase change memory [125], the calculation of potential energy surface in SiGe alloys [126], and the simulation of surface reconstruction of the Si(111)-(7 × 7) surface [127].…”
Section: Atomistic Calculation With AI (mentioning)
confidence: 99%
“…This “gold standard” – CCSD(T) – scales as N^7 (with N being the number of basis functions), 176 which makes calculating energies and forces for large data sets and larger molecules impractical. Thus, TL 50,177–180 and related Δ-learning approaches 170,181–183 gained a lot of attention in recent years and were shown to be data- and cost-effective alternatives to the “brute force” approach in quantum chemistry: a low-level PES based on a large data set of cheap reference data (e.g. DFT) is generated first, which is then used to obtain a high-level PES based on few, well-chosen high-level-of-theory (e.g.…”
Section: Knowledge Transfer (mentioning)
confidence: 99%
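The Δ-learning variant mentioned in this excerpt can be sketched similarly: a cheap baseline PES supplies the low-level energy everywhere, and a small correction model is trained only on the difference to the expensive reference for a handful of geometries. The sketch below is illustrative only, assuming precomputed descriptors and paired low-level/high-level energies; the names X_few, E_low_few, E_high_few, X_new, e_low, and the choice of a Gaussian-process correction model are assumptions, not the approach of any specific cited reference.

```python
# Minimal Δ-learning sketch (illustrative): learn the correction
# delta(x) = E_high(x) - E_low(x) from a few expensive reference points,
# then predict E_high ≈ E_low + delta for new geometries.
# Assumptions: X_few, X_new are descriptor arrays; E_low_few, E_high_few are
# 1-D energy arrays; e_low() is any cheap baseline PES (e.g. a DFT-trained model).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Correction targets for the few geometries with both levels of theory available.
delta_train = E_high_few - E_low_few

# Fit a small Gaussian-process model to the correction.
gp = GaussianProcessRegressor(
    kernel=RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-6),
    normalize_y=True,
)
gp.fit(X_few, delta_train)

# Corrected prediction for a new batch of geometries.
E_pred = e_low(X_new) + gp.predict(X_new)
```

Because the correction varies more smoothly with geometry than the total energy, it can often be learned from far fewer high-level points than a full high-level PES would require, which is what makes the approach data- and cost-effective.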