2019
DOI: 10.1021/acscentsci.8b00913
Machine Learning of Coarse-Grained Molecular Dynamics Force Fields

Abstract: Atomistic or ab initio molecular dynamics simulations are widely used to predict thermodynamics and kinetics and relate them to molecular structure. A common approach to go beyond the time- and length-scales accessible with such computationally expensive simulations is the definition of coarse-grained molecular models. Existing coarse-graining approaches define an effective interaction potential to match defined properties of high-resolution models or experimental data. In this paper, we reformulate coarse-graining…

Cited by 394 publications (581 citation statements)
References 83 publications
“…that is, the projection of the atomistic force ∇_x U onto the collective-variable space through the mapping E. In practice, the estimator (8) is very noisy: because of the dimensionality reduction from x to y, multiple realizations of the projected force f_y^lmf can be associated with the same value of the collective coordinates y, so the minimum of the loss function (8) cannot reach zero. By invoking statistical estimator theory, it can be shown that this loss can be broken down into bias, variance, and noise terms [18].…”
Section: Free Energy Surfaces
confidence: 99%
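The force-matching loss referred to in this excerpt can be sketched concretely. The snippet below is a minimal, illustrative PyTorch implementation, not the authors' code: it assumes a simple linear CG mapping matrix (here called cg_map) and a differentiable learned CG potential (cg_potential); both names are placeholders. It shows why the loss has an irreducible noise floor: many different atomistic force realizations project onto the same CG coordinates y.

```python
# Hedged sketch of a force-matching loss for a CG potential (illustrative only).
import torch

def force_matching_loss(cg_potential, cg_map, x_atomistic, f_atomistic):
    """Mean-squared deviation between projected atomistic forces and CG forces.

    cg_potential : callable, CG coordinates y -> scalar energy per sample
    cg_map       : (n_cg, n_atoms) linear mapping matrix, y = cg_map @ x
    x_atomistic  : (batch, n_atoms, 3) atomistic configurations
    f_atomistic  : (batch, n_atoms, 3) atomistic forces, -grad_x U
    """
    # Project configurations and instantaneous forces onto the CG space.
    y = torch.einsum("ca,bad->bcd", cg_map, x_atomistic)       # (batch, n_cg, 3)
    f_proj = torch.einsum("ca,bad->bcd", cg_map, f_atomistic)  # projected forces

    # CG forces from the learned potential: f_CG = -dU_CG/dy.
    y = y.detach().requires_grad_(True)
    u_cg = cg_potential(y).sum()
    f_cg = -torch.autograd.grad(u_cg, y, create_graph=True)[0]

    # Many atomistic realizations map to the same y, so this mean-squared
    # error has a noise floor and its minimum cannot reach zero.
    return ((f_cg - f_proj) ** 2).mean()
```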
“…Rather, it is about enforcing the correct asymptotic behavior of the energy when approaching an unphysical limit. CGnets proposed to achieve this by learning the difference from a simple prior energy that was defined to have the correct asymptotic behavior [18] (Fig. 5a).…”
Section: Coarse-graining: CGnets
confidence: 99%
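The prior-energy construction described here can be illustrated with a short sketch: a network term whose output is added to a fixed prior energy (here, harmonic bonds plus an excluded-volume repulsion) so that the total energy diverges in unphysical limits such as over-stretched bonds or overlapping beads. This is an assumed, CGnet-style reimplementation with placeholder functional forms, parameter values, and names, not the authors' released code.

```python
# Hedged, CGnet-style sketch: learned free-energy term + prior with correct asymptotics.
import torch
import torch.nn as nn

class PriorEnergy(nn.Module):
    """Harmonic bond + excluded-volume prior (illustrative parameter values)."""
    def __init__(self, k_bond=100.0, r0=0.38, sigma=0.35):
        super().__init__()
        self.k_bond, self.r0, self.sigma = k_bond, r0, sigma

    def forward(self, bond_distances, pair_distances):
        # Harmonic term diverges for over-stretched bonds, repulsion for overlapping beads.
        harmonic = 0.5 * self.k_bond * (bond_distances - self.r0) ** 2
        repulsion = (self.sigma / pair_distances) ** 12
        return harmonic.sum(dim=-1) + repulsion.sum(dim=-1)   # (batch,)

class CGNetLike(nn.Module):
    """Network learns only the difference from the prior energy."""
    def __init__(self, n_features, hidden=128):
        super().__init__()
        self.prior = PriorEnergy()
        self.net = nn.Sequential(
            nn.Linear(n_features, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, features, bond_distances, pair_distances):
        # Bounded network output + diverging prior = correct asymptotic behavior.
        return self.net(features).squeeze(-1) + self.prior(bond_distances, pair_distances)
```

Forces for simulation would then follow by differentiating this total energy with respect to the CG coordinates, in the same way as in the force-matching sketch above.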