2018
DOI: 10.1021/acs.jctc.8b00025

Transferable Neural Networks for Enhanced Sampling of Protein Dynamics

Abstract: Variational autoencoder frameworks have demonstrated success in reducing complex nonlinear dynamics in molecular simulation to a single nonlinear embedding. In this work, we illustrate how this nonlinear latent embedding can be used as a collective variable for enhanced sampling and present a simple modification that allows us to rapidly perform sampling in multiple related systems. We first demonstrate our method is able to describe the effects of force field changes in capped alanine dipeptide after learning…
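The core idea the abstract describes — learning a one-dimensional nonlinear embedding of simulation features and using it as a collective variable — can be sketched as follows. This is an illustrative toy, not the paper's model: the architecture (one tanh hidden layer, a scalar latent, a rank-1 linear decoder), initialization, and training loop are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

class TinyAutoencoder:
    """Features -> nonlinear 1D latent -> (linear, rank-1) reconstruction.

    The scalar latent z plays the role of the learned collective variable.
    """
    def __init__(self, n_features, n_hidden=8, scale=0.5):
        self.We = rng.normal(0, scale, (n_features, n_hidden))
        self.we = rng.normal(0, scale, n_hidden)    # hidden -> latent
        self.wd = rng.normal(0, scale, n_features)  # latent -> reconstruction

    def encode(self, X):
        """Latent embedding z(X): the candidate collective variable."""
        return np.tanh(X @ self.We) @ self.we

    def train_step(self, X, lr=0.5):
        """One full-batch gradient-descent step on reconstruction MSE."""
        n, f = X.shape
        H = np.tanh(X @ self.We)
        z = H @ self.we
        Xhat = np.outer(z, self.wd)            # rank-1 linear decoder
        dXhat = 2.0 * (Xhat - X) / (n * f)     # d(MSE)/d(Xhat)
        gwd = z @ dXhat                        # gradients, computed before updates
        dz = dXhat @ self.wd
        gwe = H.T @ dz
        gWe = X.T @ (np.outer(dz, self.we) * (1.0 - H**2))  # backprop through tanh
        self.wd -= lr * gwd
        self.we -= lr * gwe
        self.We -= lr * gWe
        return float(np.mean((Xhat - X) ** 2))
```

Once trained, `encode` maps configurations to a scalar that can be biased with standard enhanced-sampling machinery.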

Cited by 96 publications (102 citation statements)
References 59 publications
“…It also requires O(N) memory and O(N) computation time, which makes it amenable to large data sets such as those commonly encountered in biomolecular simulations. The neural network architecture does not require the selection of a kernel function or the adjustment of hyperparameters that can strongly affect the quality of the results and be tedious and challenging to tune [21,22]. In fact, we find that training such a simple fully-connected feed-forward neural network is simple, cheap, and insensitive to batch size, learning rate, and architecture.…”
Section: Discussion
confidence: 99%
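The kind of small fully-connected feed-forward network this statement describes can be sketched as below. The architecture and hyperparameters are illustrative assumptions; the point is that one full-batch pass costs O(N) in the number of samples and the training loop has almost nothing to tune.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_mlp(n_in, n_hidden, n_out):
    """One hidden layer with tanh activation; small random weights."""
    return {
        "W1": rng.normal(0, 0.1, (n_in, n_hidden)),
        "b1": np.zeros(n_hidden),
        "W2": rng.normal(0, 0.1, (n_hidden, n_out)),
        "b2": np.zeros(n_out),
    }

def forward(p, X):
    H = np.tanh(X @ p["W1"] + p["b1"])
    return H @ p["W2"] + p["b2"], H

def train_step(p, X, Y, lr=0.05):
    """One full-batch gradient-descent step on mean squared error."""
    Yhat, H = forward(p, X)
    dY = 2.0 * (Yhat - Y) / Y.size        # d(MSE)/d(output)
    gW2 = H.T @ dY                        # all gradients before any update
    gb2 = dY.sum(0)
    dH = (dY @ p["W2"].T) * (1.0 - H**2)  # backprop through tanh
    gW1 = X.T @ dH
    gb1 = dH.sum(0)
    for k, g in zip(("W1", "b1", "W2", "b2"), (gW1, gb1, gW2, gb2)):
        p[k] -= lr * g
    return float(np.mean((Yhat - Y) ** 2))
```

Each `train_step` touches every sample exactly once, so memory and time both scale linearly with the data set size.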
“…However, the high variance directions are not guaranteed to also correspond to the slow directions of the dynamics. Only variational dynamics encoders have been used to learn and bias sampling in a slow CV [22], but, as observed above, the VDE is limited to approximate only the leading eigenfunction of the transfer operator. SRVs open the door to performing accelerated sampling within the full spectrum of all relevant eigenfunctions of the transfer operator.…”
Section: Discussion
confidence: 99%
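The transfer-operator eigenfunctions this statement refers to can be illustrated with a short linear sketch: a TICA-style generalized eigenproblem over time-lagged covariances of trajectory features. The function name and toy data are assumptions; methods like SRVs replace this fixed linear feature basis with a trained neural network.

```python
import numpy as np

def transfer_operator_eigs(traj, lag):
    """Approximate eigenvalues/eigenfunction coefficients of the transfer
    operator by solving C(lag) v = lam * C(0) v for mean-free features."""
    X = traj - traj.mean(0)
    X0, Xt = X[:-lag], X[lag:]
    n = len(X0)
    C0 = X0.T @ X0 / n
    Ct = 0.5 * (X0.T @ Xt + Xt.T @ X0) / n   # symmetrized lagged covariance
    # Whiten with C0^{-1/2}, then solve an ordinary symmetric eigenproblem.
    s, U = np.linalg.eigh(C0)
    W = U @ np.diag(1.0 / np.sqrt(np.maximum(s, 1e-12))) @ U.T
    lam, V = np.linalg.eigh(W @ Ct @ W)
    order = np.argsort(lam)[::-1]            # slowest (largest lam) first
    return lam[order], (W @ V)[:, order]
```

The leading eigenvalue corresponds to the slowest relaxation process; biasing along the corresponding eigenfunction accelerates that process, and the statement's point is that the remaining spectrum is available as well.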
“…This would make the entire process into an online learning setup, where each additional round of simulation improves the dividing hyper-boundary. Additionally, these coordinates are likely to be transferable [15,17] across related systems, which would make them useful for investigating drug binding/unbinding kinetics, mutational effects, modest force field effects, etc. However, when and where transfer learning might fail is an unsolved problem.…”
Section: Discussion
confidence: 99%
“…DNNs are universal function approximators. Typical DNNs consist of a series of fully connected affine transformation layers (Figure 1d) interspersed with non-linear activation functions, such as the sigmoid, ReLU, or Swish [35]. Previous works have already highlighted the expressive power of neural networks for dimensionality reduction [16,36] and sampling [17,37]. Here we argue that, given some labeled state/trajectory data, the un-normalized output from these networks could now be used as a set of differentiable collective variables for accelerating molecular simulations.…”
Section: Incorporating Non-linearity via Kernels or Deep Neural Networks
confidence: 99%
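What "differentiable collective variable" means in practice can be shown with a toy network (weights and sizes here are arbitrary assumptions, not any cited architecture): the scalar output s(x) is the CV, and its analytic gradient ds/dx is exactly what a biasing force -dV/ds * ds/dx needs.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(0, 0.3, (6, 8))   # 6 input coordinates -> 8 hidden units
b1 = np.zeros(8)
w2 = rng.normal(0, 0.3, 8)        # hidden -> scalar CV output

def cv(x):
    """Un-normalized scalar network output used as the collective variable."""
    h = np.tanh(x @ W1 + b1)      # smooth activation keeps s(x) differentiable
    return float(h @ w2)

def cv_grad(x):
    """Analytic gradient ds/dx; combined with a bias potential V(s), the
    force on the coordinates is -dV/ds * ds/dx."""
    h = np.tanh(x @ W1 + b1)
    return W1 @ ((1.0 - h**2) * w2)
```

Because every layer is an affine map followed by a smooth activation, the chain rule gives the gradient in closed form; in a real code this is what automatic differentiation produces.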
“…A second interesting recent direction has involved applying the deep learning techniques that have proved so successful in a range of fields to biophysical problems. In particular, a number of recent articles have used autoencoder neural networks to construct collective coordinates that can be used both to analyze molecular dynamics trajectories and as a collective variable for metadynamics simulations [78,79].…”
Section: Discussion
confidence: 99%