2020
DOI: 10.26434/chemrxiv.11910948.v1
Preprint

Active Learning Accelerates Ab Initio Molecular Dynamics on Pericyclic Reactive Energy Surfaces

Abstract: Modeling dynamical effects in chemical reactions, such as post-transition state bifurcation, requires ab initio molecular dynamics simulations due to the breakdown of simpler static models like transition state theory. However, these simulations tend to be restricted to lower-accuracy electronic structure methods and scarce sampling because of their high computational cost. Here, we report the use of statistical learning to accelerate reactive molecular dynamics simulations by combining high-throughput ab init…

Cited by 3 publications (4 citation statements) | References 40 publications
“…Then, five more generations of active learning were used, with biased MD simulations at temperatures ranging from 600 to 1000 K to force larger deviations from the equilibrium structures, thus ensuring a better configurational sampling. The last generation of the NNP was trained on a complete data set containing 42K revPBE+D3 force calculations on structural models containing from 290 to 323 atoms per supercell, with a diverse set of atomic local environments in which the negative charges arising from Al substitution were compensated with [Cu(NH3)2]+, NH4+, or H+ as summarized in Table and plotted in Figure b.…”
Section: Results and Discussion
Mentioning, confidence: 99%
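The generational workflow quoted above (repeated rounds of biased MD at increasing temperature, followed by DFT labelling and retraining) can be summarized in a short sketch. This is a minimal illustration rather than the cited authors' code: the MD, DFT, and training routines are passed in as hypothetical callables, and only the generation count and temperature range mirror the numbers in the statement.

```python
# Minimal sketch of a generational active-learning workflow for a neural
# network potential (NNP). The MD, DFT, and training steps are supplied as
# callables because they depend entirely on the user's own simulation setup.

def active_learning_generations(initial_structures, run_biased_md,
                                compute_dft_forces, train_nnp,
                                temperatures=(600, 700, 800, 900, 1000)):
    """Grow a training set over successive generations of biased MD."""
    # Label the starting structures with reference forces and fit a first NNP.
    dataset = [(s, compute_dft_forces(s)) for s in initial_structures]
    potential = train_nnp(dataset)

    for temperature in temperatures:
        # Biased MD at elevated temperature pushes the system away from its
        # equilibrium structures and broadens the configurational sampling.
        new_frames = run_biased_md(potential, temperature=temperature)

        # Label the newly visited configurations with reference DFT forces
        # (revPBE+D3 in the cited work) and retrain on the enlarged data set.
        dataset.extend((frame, compute_dft_forces(frame)) for frame in new_frames)
        potential = train_nnp(dataset)

    return potential, dataset
```

Passing the expensive steps in as callables keeps the loop itself independent of any particular MD engine or electronic-structure package.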
“…3 Machine learning (ML) approaches have the potential to revolutionize force-field based simulations, aiming to provide the best of both worlds,4-6 and have indeed begun to provide new insights into a range of challenging research problems.7-15 The development of a truly general ML potential mapping nuclear coordinates to total energies and forces is, however, precluded by the curse of dimensionality. Within small chemical subspaces, models can be achieved using neural networks (NNs),6,16-20 kernel-based methods such as Gaussian processes (GP)21,22 and gradient-domain machine learning (GDML),23 and linear fitting with properly chosen basis functions,24,25 each with different data requirements and transferability.…”
Section: Introduction
Mentioning, confidence: 99%
“…Very recently, AL approaches have started to be adopted for fitting reactive potentials for organic molecules based on single point evaluations at quantum-chemical levels of theory. Notable examples include the modelling of gas-phase pericyclic reactions,13 the exploration of reactivity during methane combustion,49 and the decomposition of urea in water.40 Here, we show how ab initio quality GAPs can be trained with hundreds of total ground truth evaluations for liquid water, requiring no a priori knowledge of the system.…”
Section: Introduction
Mentioning, confidence: 99%
“…An active learning (AL) strategy was used to acquire training data efficiently by carrying out DFT only on areas with high model uncertainty.16,17 AL augments the data space and automatically improves the model by iteratively acquiring training data and deploying molecular dynamics (MD) simulations with the learned potential. Additional DFT simulations are carried out on points in the trajectory for which the model is uncertain.…”
Mentioning, confidence: 99%
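The uncertainty-driven acquisition step described in this statement can be sketched as a simple filter over an MD trajectory. The committee-disagreement criterion, the threshold, and the ensemble_predict callable below are illustrative assumptions, not details taken from the cited work.

```python
import numpy as np

def select_uncertain_frames(trajectory, ensemble_predict, threshold):
    """Return MD frames on which a committee of potentials disagrees.

    trajectory       : iterable of configurations visited during MD
    ensemble_predict : callable returning per-model force predictions for a
                       frame, as an array of shape (n_models, n_atoms, 3)
    threshold        : disagreement level above which a frame is sent back
                       to DFT for labelling
    """
    selected = []
    for frame in trajectory:
        forces = np.asarray(ensemble_predict(frame))
        # Standard deviation across the committee, maximised over atoms and
        # Cartesian components, as a simple per-frame uncertainty proxy.
        uncertainty = np.std(forces, axis=0).max()
        if uncertainty > threshold:
            selected.append(frame)
    return selected
```

Frames returned by such a filter would then receive reference DFT labels and be added to the training set before the next round of MD, closing the loop the statement describes.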