2023
DOI: 10.26434/chemrxiv-2023-v0dwk
Preprint
GraphVAMPnets for Uncovering Slow Collective Variables of Self-Assembly Dynamics

Abstract: Uncovering slow collective variables (CVs) of self-assembly dynamics is important to elucidate its numerous kinetic assembly pathways and drive the design of novel structures for advanced materials through the bottom-up approach. However, identifying the CVs for self-assembly presents several challenges. First, self-assembly systems often consist of identical monomers and the feature representations should be invariant to permutations and rotational symmetries. Physical coordinates, such as aggregate size, lac…

Cited by 1 publication (1 citation statement)
References 59 publications
“…At the level of the network architecture, Winter, Noé, and Clevert developed a permutation-invariant graph autoencoder (PIGAE) for graph reconstruction by employing an explicit permuter that learns an explicit permutation matrix for each input graph, although the quartic scaling in reordering operations during each training step makes this approach expensive. Similarly, Huang and co-workers developed a GraphVAMPnets-based approach that utilizes graph neural networks to respect permutation and rotational symmetries of particles in self-assembling systems by enforcing identical node embeddings for such symmetric particles and to learn the slow CVs from the resulting graph embeddings using VAMPnets. Prudente, Acioli, and Soares Neto and Nguyen and Le developed special-purpose neural networks for the fitting of potential energy surfaces of small molecules in which the first layer of the network was modified to respect permutational invariance, although generalization and scaling to larger systems are challenging.…”
Section: Introduction
confidence: 99%
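The citation statement above hinges on one idea: if every node (particle) is updated by the same shared function and the per-node embeddings are combined by a symmetric pooling operation such as a sum, the resulting graph embedding is invariant to any relabeling of identical monomers. The sketch below is not the authors' code; `node_update`, the integer weights, and the toy triangle graph are all illustrative assumptions, chosen only to make the invariance checkable.

```python
# Minimal sketch (hypothetical, not the GraphVAMPnets implementation):
# permutation invariance from shared node updates plus sum pooling.

def node_update(x, neighbor_feats, w_self=2, w_nbr=1):
    # The same update (shared weights) is applied to every node, so two
    # symmetric particles with identical features and neighborhoods
    # receive identical embeddings.
    return w_self * x + w_nbr * sum(neighbor_feats)

def graph_embedding(features, adjacency):
    # features: dict node -> scalar feature
    # adjacency: dict node -> list of neighbor nodes
    embedded = {
        i: node_update(features[i], [features[j] for j in adjacency[i]])
        for i in features
    }
    # Sum pooling is invariant to any relabeling of the nodes.
    return sum(embedded.values())

# Triangle graph of three monomers, under two different node labelings.
feats_a = {0: 1, 1: 2, 2: 3}
adj_a = {0: [1, 2], 1: [0, 2], 2: [0, 1]}

# Same graph with nodes 0 and 2 swapped.
feats_b = {0: 3, 1: 2, 2: 1}
adj_b = {0: [1, 2], 1: [0, 2], 2: [0, 1]}

print(graph_embedding(feats_a, adj_a) == graph_embedding(feats_b, adj_b))  # True
```

Rotational invariance, which the statement also mentions, would additionally require the input features themselves to be rotation-invariant (e.g. pairwise distances rather than raw coordinates); that part is not shown here.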