2020
DOI: 10.1007/978-3-030-40245-7_15
Active Learning and Uncertainty Estimation

Cited by 19 publications (17 citation statements)
References 22 publications
“…While performing ab initio calculations at regular intervals will eventually discover all deviations of the model, this variant of on-the-fly ML does not exploit any information about the already collected reference set and may thus lead to many redundant data points. More detailed reviews on uncertainty estimation and active sampling of PESs can be found in refs (226) and (227).…”
Section: Best Practices and Pitfalls
Mentioning, confidence: 99%
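The contrast drawn in this statement, fixed-interval reference calculations versus selection informed by the model's own uncertainty, can be sketched with a toy committee. Everything below (the 4-member committee, the energies, the 0.1 eV threshold) is illustrative and not taken from the cited work:

```python
import numpy as np

def committee_uncertainty(predictions):
    """Standard deviation across committee members for one configuration."""
    return float(np.std(predictions))

# Hypothetical energies (eV) predicted by a 4-member committee for two
# configurations: one well covered by the training set, one extrapolative.
in_domain = [1.01, 1.00, 0.99, 1.00]
out_of_domain = [0.70, 1.30, 0.95, 1.55]

threshold = 0.1  # request a new reference calculation only on disagreement
for label, preds in [("in-domain", in_domain), ("out-of-domain", out_of_domain)]:
    sigma = committee_uncertainty(np.array(preds))
    action = "trigger ab initio" if sigma > threshold else "trust MLP"
    print(f"{label}: sigma = {sigma:.3f} -> {action}")
```

Unlike a fixed-interval scheme, this spends reference calculations only where the committee actually disagrees, which is what avoids the redundant data points the statement describes.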
“…According to Eqs. (26), (27), and (28), the total ML uncertainty σ on DOS(E)_T derives from both the uncertainty on individual DOS predictions, σ_a, and the uncertainty on the phase-space sampling associated with the committee of MLPs driving the dynamics, σ_V.…”
Section: Finite-Temperature Density of States
Mentioning, confidence: 99%
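For orientation only: if σ_a and σ_V were treated as independent error sources, they would add in quadrature. That combination rule is an assumption made here for illustration (the actual propagation is whatever Eqs. (26)-(28) of the citing paper specify), and the numbers below are made up:

```python
import numpy as np

# Made-up per-grid-point uncertainties on a 3-point energy grid:
sigma_a = np.array([0.02, 0.05, 0.03])  # spread of individual DOS predictions
sigma_V = np.array([0.01, 0.04, 0.08])  # spread from phase-space sampling

# Assumed quadrature combination of independent error sources.
sigma_total = np.sqrt(sigma_a**2 + sigma_V**2)
print(np.round(sigma_total, 4))
```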
“…The uncertainty is estimated on the basis of the spread of the predictions of an ensemble (committee) of independently trained ML models, e.g. by subsampling of the full training dataset [25][26][27][28][29]. These uncertainty quantification schemes provide qualitative information on the reliability of the ML predictions, and are widely used in the context of online and offline active learning, to identify regions of configuration space that need to be added to the training set 7,[30][31][32][33][34][35].…”
Section: Introduction
Mentioning, confidence: 99%
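The subsampling construction described in this statement can be mimicked on a deliberately simple 1-D stand-in for a PES, with polynomial fits to bootstrap subsamples playing the role of independently trained ML models (the model class, grid, and seed are all illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy 1-D reference data standing in for ab initio energies on a PES.
x_train = np.linspace(-1.0, 1.0, 30)
y_train = np.cos(3.0 * x_train)

def train_committee(n_members=8, degree=6):
    """Each member is fit to a random half of the full training dataset."""
    members = []
    for _ in range(n_members):
        idx = rng.choice(len(x_train), size=len(x_train) // 2, replace=False)
        members.append(np.polyfit(x_train[idx], y_train[idx], degree))
    return members

def predict_with_uncertainty(members, x):
    preds = np.array([np.polyval(c, x) for c in members])
    return preds.mean(axis=0), preds.std(axis=0)

committee = train_committee()
# Compare the spread at a point inside the training interval with one far
# outside it: the committee disagrees where data is absent.
_, sigma_in = predict_with_uncertainty(committee, np.array([0.0]))
_, sigma_out = predict_with_uncertainty(committee, np.array([2.0]))
print(sigma_in[0], sigma_out[0])
```

The spread is only a qualitative reliability signal, as the statement notes, but that is enough to flag regions of configuration space that need more training data.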
“…Even when uncertainty estimates are available to distinguish informative from uninformative inputs, ML potentials rely on atomistic simulations to generate new trial configurations, and bootstrapping a potential becomes an infinite-regress problem: the training data for the potential needs to represent the full PES, but thoroughly sampling the PES requires exhaustive sampling, which needs long simulations with a stable, accurate potential. It is common to perform molecular dynamics (MD) simulations with NN-based models to expand their training set in an active learning (AL) loop 20,25,38. MD simulations explore the phase space based on the thermodynamic probability of the PES.…”
Mentioning, confidence: 99%
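A minimal offline AL loop in the spirit of this passage, with a cheap analytic function standing in for the expensive reference calculation and a fixed grid of trial points standing in for MD-generated snapshots (all names and numbers are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

def reference_energy(x):
    """Stand-in for an expensive ab initio calculation."""
    return np.cos(3.0 * x)

def fit_committee(xs, ys, n_members=6, degree=5):
    members = []
    for _ in range(n_members):
        idx = rng.choice(len(xs), size=max(degree + 2, 2 * len(xs) // 3),
                         replace=False)
        members.append(np.polyfit(xs[idx], ys[idx], degree))
    return members

def spread(members, x):
    return np.std([np.polyval(c, x) for c in members])

# Sparse initial training set, then iterate: retrain, find the most
# uncertain trial configuration, label it with the reference method.
xs = np.linspace(-0.3, 0.3, 8)
ys = reference_energy(xs)
trial = np.linspace(-1.0, 1.0, 41)  # stand-in for MD-generated snapshots

for _ in range(5):
    committee = fit_committee(xs, ys)
    worst = max(trial, key=lambda x: spread(committee, x))
    xs = np.append(xs, worst)  # one new reference label per AL cycle
    ys = np.append(ys, reference_energy(worst))

print(len(xs))  # 8 initial points plus 5 actively selected ones
```

On-the-fly (online) variants interleave this selection with a running MD trajectory instead of scoring a fixed trial set, which is the distinction the statement draws between online and offline AL.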