2020
DOI: 10.1063/5.0009264
Using principal component analysis for neural network high-dimensional potential energy surface

Abstract: Charge transfer (CT) at avoided crossings of excited ionized states of argon dimers is observed using a two-color pump-probe experiment at the free-electron laser in Hamburg (FLASH). The process is initiated by the absorption of three 27-eV-photons from the pump pulse, which leads to the population of Ar 2+ *-Ar states. Due to nonadiabatic coupling between these one-site doubly ionized states and two-site doubly ionized states of the type Ar + *-Ar + , CT can take place leading to the population of the latter …

Cited by 12 publications (10 citation statements)
References 60 publications
“…Alternatively, for a given data set, optimum symmetry functions sets can be generated systematically by selecting the best functions from a large pool of functions 142 or by principal component analysis. 143 This can be done in a highly automatic way, but the drawback of this approach is the dependence on the underlying data set, which is often not static, but changes in particular in early stages of potential development, when new structures are regularly added (see section 6.2). Moreover, while the known structures may be efficiently distinguished by the symmetry functions generated in this way, new and possibly very different types of structures emerging in simulations may not be equally well described and may require a redefinition of the symmetry functions and consequently a new training of the HDNNP.…”
Section: Discussion of ACSFs (mentioning; confidence: 99%)
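The excerpt above describes selecting an optimal symmetry-function set by principal component analysis of the descriptor values over a reference data set. A minimal numpy-only sketch of that idea (the data, threshold, and function name here are illustrative assumptions, not the cited authors' implementation):

```python
import numpy as np

def pca_select(G, var_threshold=0.99):
    """Find how many principal components of a symmetry-function
    matrix G (n_structures x n_symfuncs) are needed to capture
    `var_threshold` of the total variance. Hypothetical helper."""
    Gc = G - G.mean(axis=0)                      # center each descriptor
    # economy-size SVD; singular values give component variances
    U, s, Vt = np.linalg.svd(Gc, full_matrices=False)
    var_ratio = s**2 / np.sum(s**2)              # explained-variance ratios
    n_keep = int(np.searchsorted(np.cumsum(var_ratio), var_threshold) + 1)
    return n_keep, Vt[:n_keep]                   # components as row vectors

# Toy data set: 200 structures described by 10 candidate symmetry
# functions, but with only 3 independent degrees of freedom plus noise.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 3)) * np.array([3.0, 2.0, 1.0])
mix = np.linalg.qr(rng.normal(size=(10, 3)))[0].T  # orthonormal 3x10 map
G = latent @ mix + 0.01 * rng.normal(size=(200, 10))

n_keep, components = pca_select(G)
print(n_keep)  # → 3: three components already explain >99% of the variance
```

This illustrates the data-set dependence the excerpt warns about: adding new, structurally different configurations to `G` can change which (and how many) components dominate, forcing a redefinition of the descriptor set and retraining of the HDNNP.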
“…Alternatively, for a given data set, optimum symmetry functions sets can be generated systematically by selecting the best functions from a large pool of functions or by principal component analysis . This can be done in a highly automatic way, but the drawback of this approach is the dependence on the underlying data set, which is often not static, but changes in particular in early stages of potential development, when new structures are regularly added (see section ).…”
Section: Second-Generation Neural Network Potentials (mentioning; confidence: 99%)
“…Some more advanced methods for mapping the multi-dimensional thermal analysis data for the input layer of neural network still need to be developed. Muravyev and Pivkina [21] propose using principal component analysis [57,58,59] or introducing additional hidden layers for this purpose. To the best of the authors’ knowledge, no more advanced methods of introducing TA data to ANNs have been proposed so far.…”
Section: Technical Details Behind ANNs (mentioning; confidence: 99%)
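The thermal-analysis excerpt proposes PCA as one way to compress multi-dimensional TA data into a compact ANN input layer. A sketch of that pipeline, under assumed dimensions and random illustrative weights (none of this is the cited authors' code):

```python
import numpy as np

rng = np.random.default_rng(1)
# 50 hypothetical thermal-analysis curves, each sampled at 500 points:
# far too wide to feed directly into a small network's input layer.
curves = rng.normal(size=(50, 500))
curves = curves - curves.mean(axis=0)          # center before PCA
_, s, Vt = np.linalg.svd(curves, full_matrices=False)
scores = curves @ Vt[:5].T                     # 5 PC scores per curve

# Tiny one-hidden-layer network; weights are random placeholders,
# shown only to make the reduced input dimension concrete.
W1 = rng.normal(size=(5, 8)); b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)
hidden = np.tanh(scores @ W1 + b1)             # (50, 8) activations
out = hidden @ W2 + b2                         # (50, 1) predictions
print(out.shape)
```

The design point is that the network's first weight matrix shrinks from 500 x 8 to 5 x 8, at the cost of discarding whatever variance lies outside the leading components.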
“…[2][3][4] These methods are of great interest for theoreticians because they allow for the analysis, classification and prediction of various properties conventionally requiring a large amount of data generated via computationally demanding quantum mechanical calculations. 5 Indeed, machine learning techniques can be applied to a broad range of problems, including, among others, potential energy surface fitting, [6][7][8][9] ab initio molecular dynamics, [10][11][12] prediction of various scalar properties [13][14][15] (e.g., atomization energies, polarizability coefficients, highest occupied molecular orbital energies, electronic structure correlation energies, etc.), vectorial and tensorial quantities (e.g., forces, polarizability tensors, etc.), 16,17 and spectra. [18][19][20] The chemical compound space is characterized by huge dimensionality and complexity.…”
Section: Introduction (mentioning; confidence: 99%)