2020
DOI: 10.1063/5.0014677
Neural network potential from bispectrum components: A case study on crystalline silicon

Abstract: In this article, we present a systematic study on developing machine learning force fields (MLFFs) for crystalline silicon. While the mainstream approach of fitting an MLFF is to use a small and localized training set from molecular dynamics simulations, such a set is unlikely to cover the global features of the potential energy surface. To remedy this issue, we used randomly generated symmetrical crystal structures to train a more general Si-MLFF. Furthermore, we performed substantial benchmarks among different choic…
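The random symmetric structure generation described in the abstract can be sketched as follows; this is a minimal illustration assuming the PyXtal package, and the particular space group and cell content are arbitrary choices, not taken from the paper, which only states that randomly generated symmetrical crystal structures were used.

```python
from pyxtal import pyxtal  # random symmetry-constrained crystal generator (assumed tool)

# Minimal sketch: build one random silicon cell in a chosen space group.
# Space group 227 (Fd-3m, the diamond structure) and 8 atoms per cell are
# illustrative assumptions.
xtal = pyxtal()
xtal.from_random(3, 227, ['Si'], [8])  # 3D crystal, space group, species, atom counts
print(xtal)  # prints lattice, space group, and Wyckoff sites
```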

Cited by 15 publications (6 citation statements)
References 51 publications
“…The fitting of ML potentials is performed in descriptor space and the (statistical) ML procedure of the fit defines the performance and limitations of the potential. The relationship between atomic energies and components of the descriptors can be linear [35-41] or nonlinear [11,41-53]. The linear model does not imply a linear relation between the phase space and the observable.…”
Section: Introduction
mentioning
confidence: 99%
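As a rough illustration of the linear-versus-nonlinear fits contrasted above, the sketch below regresses synthetic per-atom energies on synthetic descriptor vectors once with a linear model and once with a tiny neural network; all data, dimensions, and hyperparameters are placeholders, not values from any of the cited works.

```python
import numpy as np

# Synthetic stand-ins for per-atom descriptor vectors B (n_atoms x n_comp)
# and reference atomic energies E (n_atoms,); purely illustrative.
rng = np.random.default_rng(0)
n_atoms, n_comp = 200, 30
B = rng.normal(size=(n_atoms, n_comp))
E = rng.normal(size=n_atoms)

# Linear model: E_i ~ w . B_i + b, fitted by ordinary least squares.
A = np.hstack([B, np.ones((n_atoms, 1))])       # append a bias column
coef, *_ = np.linalg.lstsq(A, E, rcond=None)
E_linear = A @ coef

# Nonlinear model: E_i ~ w2 . tanh(W1^T B_i + b1) + b2, trained with a few
# steps of plain gradient descent on the mean-squared error.
hidden = 16
W1 = rng.normal(scale=0.1, size=(n_comp, hidden)); b1 = np.zeros(hidden)
w2 = rng.normal(scale=0.1, size=hidden);           b2 = 0.0
lr = 1e-2
for _ in range(500):
    z = np.tanh(B @ W1 + b1)                    # hidden activations
    pred = z @ w2 + b2
    g = 2.0 * (pred - E) / n_atoms              # dLoss/dpred for the MSE loss
    gz = np.outer(g, w2) * (1.0 - z**2)         # backprop through tanh
    w2 -= lr * (z.T @ g);  b2 -= lr * g.sum()
    W1 -= lr * (B.T @ gz); b1 -= lr * gz.sum(axis=0)
```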
“…Any non-linear regression becomes linear if the domain of the function is projected into a space with a sufficiently large number of dimensions [54,55]. Non-linear models are most commonly based on NN [11,41-44] or kernel methods [46-53]. Using a linear kernel is equivalent to performing a linear regression while a polynomial kernel is equivalent to linear regression with a basis set formed from outer products of the elements of the feature vectors [56].…”
Section: Introduction
mentioning
confidence: 99%
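The kernel equivalences stated in this passage can be checked numerically. The sketch below uses synthetic data and a small ridge penalty (added here for numerical stability, an assumption beyond the quoted text) to show that kernel regression with a linear kernel reproduces the corresponding linear (primal) regression.

```python
import numpy as np

# Synthetic descriptors and targets; purely illustrative.
rng = np.random.default_rng(1)
X  = rng.normal(size=(50, 8))   # training feature vectors
y  = rng.normal(size=50)        # training targets
Xt = rng.normal(size=(5, 8))    # test feature vectors
lam = 1e-3                      # small ridge penalty for stability

# Dual (kernel) solution with a linear kernel k(x, x') = x . x'
K = X @ X.T
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
pred_kernel = (Xt @ X.T) @ alpha

# Primal (feature-space) ridge regression: w = (X^T X + lam I)^-1 X^T y
w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
pred_primal = Xt @ w

print(np.allclose(pred_kernel, pred_primal))  # True: identical predictions

# A polynomial kernel such as (x . x' + c)^d would likewise match a linear fit
# on the degree-d monomials (outer products) of the feature vector.
```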
“…Behler-Parrinello [14] and bispectrum coefficients [35] are examples of the descriptors used in MLIPs [36]. MLIPs generally include large sets of descriptors in order to ensure a reliable description of reasonable atomic environments. The regression model is the second basic element of an MLIP, which can be based on linear/polynomial regression, kernel methods, or artificial neural networks.…”
Section: Training a MLIP for the Analysis of Mechanical Properties
mentioning
confidence: 99%
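As a concrete example of one such descriptor, the sketch below evaluates a single Behler-Parrinello-type radial symmetry function (G2) for a toy cluster; the functional form is the standard one, but the parameter values and atomic positions are illustrative placeholders.

```python
import numpy as np

def cutoff(r, rc):
    """Smooth cosine cutoff: 0.5*(cos(pi*r/rc) + 1) for r < rc, zero beyond."""
    return np.where(r < rc, 0.5 * (np.cos(np.pi * r / rc) + 1.0), 0.0)

def g2(positions, eta=0.5, rs=2.0, rc=5.0):
    """One G2 value per atom: sum over neighbors of exp(-eta*(r_ij - rs)^2) * fc(r_ij)."""
    n = len(positions)
    desc = np.zeros(n)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            r = np.linalg.norm(positions[i] - positions[j])
            desc[i] += np.exp(-eta * (r - rs) ** 2) * cutoff(r, rc)
    return desc

# Toy silicon-like positions in angstrom (illustrative only).
pos = np.array([[0.0, 0.0, 0.0], [2.35, 0.0, 0.0], [0.0, 2.35, 0.0]])
print(g2(pos))  # one descriptor component per atom
```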
“…Compared to GAP, NN is more suitable for large-scale simulation due to its better scalability. Very recently, we have developed the NN version of the spectral neighbor analysis potential (NN-SNAP) [24-27] based on the bispectrum coefficient descriptors [15,28] and implemented it in the ML-IAP package inside the LAMMPS software [29]. To train an accurate NN-SNAP model for describing GaN's B4-B1 transition, we start with the existing dataset from a recent work [16].…”
mentioning
confidence: 99%
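A hedged sketch of how a SNAP-descriptor ML-IAP potential might be driven from Python through the LAMMPS wrapper is shown below; the "model nn" keyword, the potential and data file names, and the thermostat settings are assumptions for illustration and should be checked against the LAMMPS pair_style mliap documentation rather than read as the authors' actual input.

```python
from lammps import lammps  # LAMMPS Python wrapper

# Sketch only: load an ML-IAP potential built on SNAP (bispectrum) descriptors
# and run a short NVT trajectory. File names are placeholders; the "model nn"
# keyword is an assumption based on the NN-SNAP description in the quoted text.
lmp = lammps()
lmp.commands_string("""
units        metal
boundary     p p p
read_data    gan.data                       # placeholder structure file

pair_style   mliap model nn GaN.nn.mliap.model descriptor sna GaN.mliap.descriptor
pair_coeff   * * Ga N

timestep     0.001
fix          1 all nvt temp 300.0 300.0 0.1
run          1000
""")
```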
“…S5-S7), there is clear evidence of nucleus formation at the critical stage. The B1 nucleus for N = 2000 grows rapidly and completes the entire transition within only one metastep (25), with an energy barrier of 0.215 eV/atom. The nucleation process is seen to be particularly effective in further reducing the barrier in larger systems.…”
mentioning
confidence: 99%