2021
DOI: 10.1103/physrevb.104.104309
Neuroevolution machine learning potentials: Combining high accuracy and low cost in atomistic simulations and application to heat transport

Abstract: We develop a neuroevolution-potential (NEP) framework for generating neural-network-based machine-learning potentials. They are trained using an evolutionary strategy for performing large-scale molecular dynamics (MD) simulations. A descriptor of the atomic environment is constructed based on Chebyshev and Legendre polynomials. The method is implemented for graphics processing units within the open-source GPUMD package, which can attain a computational speed of over 10^7 atom-steps per second using one Nvidia Tesla V…
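The abstract's descriptor idea can be sketched in a few lines: Chebyshev polynomials of the mapped interatomic distance for radial components, Legendre polynomials of neighbor-pair angles for angular components. This is a minimal illustration only; the cutoff function, mapping, and normalization are common choices assumed here, not necessarily the exact forms used in the paper.

```python
import numpy as np
from numpy.polynomial.chebyshev import chebval
from numpy.polynomial.legendre import legval

def smooth_cutoff(r, rc):
    # Cosine cutoff that vanishes smoothly at r = rc (a common choice;
    # the exact form in the paper may differ).
    return np.where(r < rc, 0.5 * (1.0 + np.cos(np.pi * r / rc)), 0.0)

def radial_components(r, n_max, rc):
    # Chebyshev-polynomial radial components g_n = sum_j T_n(x_j) fc(r_j),
    # with the distance mapped to x in [-1, 1].
    x = 2.0 * np.clip(r / rc, 0.0, 1.0) - 1.0
    fc = smooth_cutoff(r, rc)
    return np.array([np.sum(chebval(x, np.eye(n_max + 1)[n]) * fc)
                     for n in range(n_max + 1)])

def angular_components(cos_theta, weights, l_max):
    # Legendre-polynomial angular components over neighbor pairs:
    # q_l = sum_jk w_jk P_l(cos theta_jk).
    return np.array([np.sum(legval(cos_theta, np.eye(l_max + 1)[l]) * weights)
                     for l in range(l_max + 1)])
```

The key property of this construction is that both component families are smooth in the atomic coordinates and invariant under rotation and permutation of neighbors, which is what makes them usable as neural-network inputs.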

Cited by 131 publications (136 citation statements)
References 90 publications
“…In summary, we have presented the various features of the open-source gpumd package, with a focus on recent developments that have enabled the generation and use of accurate and efficient NEP MLPs. 41,42 Two improvements on the atomic-environment descriptor have been introduced: one is to change the radial functions from Chebyshev basis functions to linear combinations of the basis functions, and the other is to extend the angular descriptor components by considering some 4-body and 5-body contributions as in the ACE approach. 46 Both of these extensions are shown to improve the accuracy of NEP models further.…”
Section: Discussion
confidence: 99%
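The "linear combinations of the basis functions" improvement described in this excerpt can be sketched as follows: instead of using each Chebyshev basis function directly as a radial function, every radial function becomes a trainable mixture of the whole basis. The basis form, cutoff, and coefficient layout here are illustrative assumptions, not the exact GPUMD implementation.

```python
import numpy as np
from numpy.polynomial.chebyshev import chebval

def chebyshev_basis(r, k_max, rc):
    # Fixed Chebyshev basis functions f_k(r), each multiplied by a cosine
    # cutoff; shape (k_max + 1, len(r)). Illustrative form only.
    x = 2.0 * np.clip(r / rc, 0.0, 1.0) - 1.0
    fc = np.where(r < rc, 0.5 * (1.0 + np.cos(np.pi * r / rc)), 0.0)
    return np.array([chebval(x, np.eye(k_max + 1)[k]) * fc
                     for k in range(k_max + 1)])

def radial_functions(r, c, rc):
    # Trainable radial functions g_n(r) = sum_k c[n, k] f_k(r): each g_n
    # is a linear combination of the fixed basis rather than a single
    # basis function, so its shape can be optimized during training.
    return c @ chebyshev_basis(r, c.shape[1] - 1, rc)
```

With the identity matrix as coefficients this reduces to the plain basis, which makes the extension strictly more expressive than the original scheme at essentially no extra descriptor cost.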
“…However, thermal transport usually involves large length and long time scales and GAP18 is not currently efficient enough for this purpose. The NEP model as implemented in the gpumd package [27,28], on the other hand, can reach a computational speed of about 5 × 10^6 atom-steps per second for a-Si by using a single GPU card such as Tesla V100, which is about three orders of magnitude faster than GAP18 using 72 Xeon-Gold 6240 central processing unit (CPU) cores [18].…”
Section: Training a machine-learned potential for a-Si
confidence: 99%
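The throughput figure quoted above translates directly into wall-time estimates, since total MD work is atoms times steps. A back-of-the-envelope helper (the atom and step counts below are illustrative, not from the paper):

```python
def wall_time_seconds(n_atoms, n_steps, atom_steps_per_second):
    # Idealized wall time for an MD run: total work (atoms * steps)
    # divided by throughput; ignores I/O and startup overhead.
    return n_atoms * n_steps / atom_steps_per_second

# At the quoted ~5e6 atom-steps/s, a hypothetical run of one million
# atoms for one million steps takes 2e5 s, roughly 2.3 days:
t = wall_time_seconds(1_000_000, 1_000_000, 5e6)
```

The same arithmetic explains why a three-orders-of-magnitude speedup matters for thermal transport: it turns runs that would take years on the slower model into days.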
“…A NEP model has already been trained previously [18] (we call it NEP21), but we here re-train it by changing the relative weight of virial from 1 to 0.1 and keeping all the other hyperparameters as used in Ref. 18 unchanged.…”
Section: Training a machine-learned potential for a-Si
confidence: 99%
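The "relative weight of virial" adjusted in this excerpt enters the training objective as a coefficient on the virial error term. A minimal sketch of such a weighted loss (the RMSE form and the omission of regularization terms are assumptions here, not the exact NEP loss function):

```python
import numpy as np

def rmse(pred, ref):
    # Root-mean-square error between predictions and reference data.
    return float(np.sqrt(np.mean((np.asarray(pred) - np.asarray(ref)) ** 2)))

def weighted_loss(pred, ref, w_energy=1.0, w_force=1.0, w_virial=0.1):
    # Weighted sum of energy, force, and virial RMSEs; lowering w_virial
    # from 1 to 0.1, as in the re-training described above, reduces how
    # strongly virial errors pull on the fit relative to forces.
    return (w_energy * rmse(pred["energy"], ref["energy"])
            + w_force * rmse(pred["force"], ref["force"])
            + w_virial * rmse(pred["virial"], ref["virial"]))
```

Down-weighting one target is a standard multi-objective trade-off: it typically improves the other fitted quantities at the cost of a somewhat larger error in the down-weighted one.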