2021
DOI: 10.48550/arxiv.2106.07971
Preprint

Simple GNN Regularisation for 3D Molecular Property Prediction & Beyond

Cited by 12 publications (16 citation statements) · References 0 publications
“…For our GNN architecture, we adopt the Encode-Process-Decode logic that is now popular for learning on graph-based physics problems (Godwin et al., 2022), albeit with some notable modifications. Figure 3 shows an overview of the Flow Reconstruction GNN.…”
Section: Architecture · mentioning
confidence: 99%
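
The Encode-Process-Decode pattern referenced in this excerpt splits a GNN into an encoder that embeds raw node features, a processor that runs several rounds of message passing, and a decoder that reads out predictions. Below is a minimal PyTorch sketch of that pattern; the class, the dimensions, and the sum aggregation are illustrative assumptions, not the cited papers' exact design.

```python
import torch
import torch.nn as nn

class EncodeProcessDecode(nn.Module):
    """Illustrative Encode-Process-Decode GNN (names and sizes are assumptions)."""
    def __init__(self, in_dim, hidden_dim, out_dim, num_steps=4):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.ReLU())
        self.message_mlp = nn.Sequential(nn.Linear(2 * hidden_dim, hidden_dim), nn.ReLU())
        self.update_mlp = nn.Sequential(nn.Linear(2 * hidden_dim, hidden_dim), nn.ReLU())
        self.decoder = nn.Linear(hidden_dim, out_dim)
        self.num_steps = num_steps

    def forward(self, x, edge_index):
        # x: [num_nodes, in_dim]; edge_index: [2, num_edges] holding (src, dst).
        src, dst = edge_index
        h = self.encoder(x)                                       # encode
        for _ in range(self.num_steps):                           # process
            m = self.message_mlp(torch.cat([h[src], h[dst]], dim=-1))
            agg = torch.zeros_like(h).index_add_(0, dst, m)       # sum-aggregate
            h = h + self.update_mlp(torch.cat([h, agg], dim=-1))  # residual update
        return self.decoder(h)                                    # decode
```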
“…As demonstrated by Wu, Zhang, Jin, Jiang, and Li, [79] the denoising diffusion architecture [80][81][82] is closely connected to enhanced sampling methods in MD, [83][84][85][86] where energy is injected into the microscopic system to smooth the biomolecular potential energy surface and lower energy barriers. Moreover, Godwin et al [33] showed that simple noise regularization can be an effective way to address oversmoothing: [87] a noise correction target, implemented as an auxiliary denoising autoencoder loss, prevents oversmoothing by enforcing diversity in the last few layers of the GNN.…”
Section: Time-Series Prompting for Motion Prediction · mentioning
confidence: 99%
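
The noise regularization credited to Godwin et al [33] above corrupts the input coordinates with Gaussian noise and adds an auxiliary head that must reconstruct that noise, which keeps the final GNN layers diverse. The sketch below assumes a generic trunk mapping (positions, edges) to node embeddings; the head shapes, sigma, and loss weight are illustrative, not values from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NoisyNodesWrapper(nn.Module):
    """Sketch: a shared GNN trunk feeds a property head plus an auxiliary
    head trained to predict the injected input noise."""
    def __init__(self, trunk, hidden_dim, pos_dim=3):
        super().__init__()
        self.trunk = trunk                         # any GNN returning node embeddings
        self.property_head = nn.Linear(hidden_dim, 1)
        self.denoise_head = nn.Linear(hidden_dim, pos_dim)

    def loss(self, pos, edge_index, target, sigma=0.02, aux_weight=0.1):
        noise = sigma * torch.randn_like(pos)      # corrupt the 3D coordinates
        h = self.trunk(pos + noise, edge_index)    # embeddings of the noisy input
        pred = self.property_head(h).mean(dim=0)   # graph property via mean pooling
        aux = F.mse_loss(self.denoise_head(h), noise)  # noise-correction target
        return F.mse_loss(pred, target) + aux_weight * aux
```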
“…To mitigate the over-smoothing problem, we use multiple aggregation functions within the message-passing scheme, [45] along with residual connections. [46,47] The encoder is constructed by stacking several propagation layers, each followed by a ReLU activation function. Node embeddings are then concatenated with a residual connection from the input graph.…”
Section: Lone Pair Prediction · mentioning
confidence: 99%
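
As a sketch of the construction this excerpt describes, the layer below aggregates incoming messages with sum, mean, and max, mixes them through a ReLU MLP with a residual connection, and the stacked encoder concatenates the final embedding with the encoded input graph. The specific trio of aggregators and all dimensions are assumptions.

```python
import torch
import torch.nn as nn

def multi_aggregate(msg, dst, num_nodes):
    """Sum, mean, and max aggregation of edge messages into destination nodes."""
    dim = msg.size(1)
    total = msg.new_zeros(num_nodes, dim).index_add_(0, dst, msg)
    deg = msg.new_zeros(num_nodes, 1).index_add_(0, dst, msg.new_ones(len(dst), 1))
    mean = total / deg.clamp(min=1.0)
    idx = dst.unsqueeze(-1).expand_as(msg)
    mx = msg.new_zeros(num_nodes, dim).scatter_reduce_(
        0, idx, msg, reduce="amax", include_self=False)
    return torch.cat([total, mean, mx], dim=-1)

class PropagationLayer(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.update = nn.Sequential(nn.Linear(4 * dim, dim), nn.ReLU())

    def forward(self, h, edge_index):
        src, dst = edge_index
        agg = multi_aggregate(h[src], dst, h.size(0))
        return h + self.update(torch.cat([h, agg], dim=-1))   # residual connection

class Encoder(nn.Module):
    def __init__(self, in_dim, dim, num_layers=4):
        super().__init__()
        self.embed = nn.Linear(in_dim, dim)
        self.layers = nn.ModuleList(PropagationLayer(dim) for _ in range(num_layers))

    def forward(self, x, edge_index):
        h0 = torch.relu(self.embed(x))
        h = h0
        for layer in self.layers:
            h = layer(h, edge_index)
        return torch.cat([h, h0], dim=-1)   # residual concat from the input graph
```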
“…Multiple solutions have been proposed, such as residual connections and augmentations. [46,47] In this work, we concatenate the outputs of intermediate layers, tackling both the over-smoothing and vanishing-gradient problems.…”
Section: Learning Natural Bond Interactions · mentioning
confidence: 99%
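
One way to read the intermediate-layer concatenation described in this excerpt is the jumping-knowledge-style readout sketched below: every layer's output is retained and concatenated before the final projection, so early features survive over-smoothing and gradients reach early layers directly. The wrapper and its names are hypothetical, not the citing paper's code.

```python
import torch
import torch.nn as nn

class ConcatIntermediateGNN(nn.Module):
    """Sketch: concatenate each propagation layer's output before the readout."""
    def __init__(self, layers, dim, out_dim):
        super().__init__()
        self.layers = nn.ModuleList(layers)   # each maps [N, dim] -> [N, dim]
        self.readout = nn.Linear(dim * (len(self.layers) + 1), out_dim)

    def forward(self, h, edge_index):
        outs = [h]                            # keep the input embedding as well
        for layer in self.layers:
            h = layer(h, edge_index)
            outs.append(h)                    # keep every intermediate output
        return self.readout(torch.cat(outs, dim=-1))
```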