2022
DOI: 10.48550/arxiv.2206.00133
Preprint

Pre-training via Denoising for Molecular Property Prediction

Abstract: Many important problems involving molecular property prediction from 3D structures have limited data, posing a generalization challenge for neural networks. In this paper, we describe a pre-training technique that utilizes large datasets of 3D molecular structures at equilibrium to learn meaningful representations for downstream tasks. Inspired by recent advances in noise regularization, our pre-training objective is based on denoising. Relying on the well-known link between denoising autoencoders and score-ma…

Cited by 15 publications (25 citation statements)
References 19 publications
“…The first one is a supervised learning objective, which aims to predict the HOMO-LUMO energy gap of each molecule. Besides, we also use a self-supervised learning objective called 3D Position Denoising (Zaidi et al., 2022), which is particularly effective. During training, if a data instance is in the 3D mode, we add Gaussian noise to the position of each atom and require the model to predict the noise from the noisy input.…”
Section: Large-scale Pre-training
Mentioning confidence: 99%
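For concreteness, here is a minimal sketch of the position-denoising objective this quote describes: Gaussian noise is added to equilibrium coordinates and the network regresses the noise. The `model` interface, the noise scale `sigma`, and the loss reduction are illustrative assumptions, not details taken from the cited papers.

```python
import torch
import torch.nn as nn

def position_denoising_loss(model: nn.Module, atom_types: torch.Tensor,
                            pos: torch.Tensor, sigma: float = 0.04) -> torch.Tensor:
    """One training step of 3D position denoising for a single molecule.

    atom_types: (num_atoms,) integer atom types.
    pos:        (num_atoms, 3) equilibrium coordinates.
    """
    noise = sigma * torch.randn_like(pos)        # perturb each atom's position
    noisy_pos = pos + noise
    pred = model(atom_types, noisy_pos)          # assumed to return (num_atoms, 3)
    return ((pred - noise) ** 2).sum(dim=-1).mean()  # regress the added noise
```

In the multi-task setting the quote describes, this loss would simply be added to the supervised HOMO-LUMO gap loss for instances that carry 3D coordinates.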
“…Prediction head for position output (3D Position Denoising). Following Zaidi et al. (2022), we further adopt the 3D Position Denoising task as a self-supervised learning objective. During training, if a data instance is in the 3D mode, we add Gaussian noise to the position of each atom.…”
Section: A. Implementation Details of Transformer-M
Mentioning confidence: 99%
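The quote mentions a dedicated prediction head for the position output. As a rough illustration only (Transformer-M's actual head is more elaborate), a per-atom MLP mapping encoder features to a 3D noise estimate could look as follows; the class name and layer sizes are hypothetical.

```python
import torch
import torch.nn as nn

class NoiseHead(nn.Module):
    """Hypothetical per-atom head: encoder features -> predicted 3D noise."""

    def __init__(self, hidden_dim: int = 256):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(hidden_dim, hidden_dim),
            nn.GELU(),
            nn.Linear(hidden_dim, 3),  # one 3-vector per atom
        )

    def forward(self, node_features: torch.Tensor) -> torch.Tensor:
        # node_features: (num_atoms, hidden_dim) -> (num_atoms, 3)
        return self.mlp(node_features)
```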
“…Inspired by Noisy Nodes [26], which incorporates denoising (Fig. 3g) as an auxiliary task to improve performance, a recent work [128] adds noise to the atomic coordinates of a 3D geometry and pre-trains the encoder to predict the noise. They further demonstrate that the denoising objective in molecular pre-training is approximately equivalent to learning a molecular force field, shedding light on how denoising aids molecular pre-training.…”
Section: Denoising Modeling (DM)
Mentioning confidence: 99%
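The approximate equivalence the quote refers to can be sketched with the standard denoising score-matching identity (Vincent, 2011), under the assumption that equilibrium structures follow a Boltzmann distribution; this is a summary of the cited argument, not a full derivation.

```latex
% Regressing the Gaussian noise is, up to scale, score matching on the
% noised distribution q_sigma:
\mathbb{E}_{x,\;\varepsilon \sim \mathcal{N}(0,\sigma^2 I)}
  \left\| s_\theta(x+\varepsilon) + \frac{\varepsilon}{\sigma^2} \right\|^2
  \;=\;
  \mathbb{E}_{\tilde{x} \sim q_\sigma}
  \left\| s_\theta(\tilde{x}) - \nabla_{\tilde{x}} \log q_\sigma(\tilde{x}) \right\|^2
  + \text{const.}

% If equilibrium conformations follow a Boltzmann distribution
% p(x) \propto \exp(-E(x)/kT), the score is proportional to the force field:
\nabla_x \log p(x) \;=\; -\frac{1}{kT}\,\nabla_x E(x) \;=\; \frac{1}{kT}\,F(x)
```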
“…You et al. [125] propose to maximize the agreement between paired molecular graph augmentations using a contrastive objective [9]. Zaidi et al. [128] demonstrate that performing denoising on the conformational space can be helpful for learning molecular force fields.…”
Section: Introduction
Mentioning confidence: 99%
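As a point of contrast with denoising, here is a minimal SimCLR-style sketch of the agreement-maximization objective attributed to You et al. [125]: embeddings of two augmented views of the same graph are pulled together against in-batch negatives. The temperature value and the simplified (cross-view-only) negative set are assumptions for brevity, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def contrastive_loss(z1: torch.Tensor, z2: torch.Tensor,
                     temperature: float = 0.1) -> torch.Tensor:
    """NT-Xent-style loss; row i of z1 and z2 embed two views of graph i."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature              # pairwise cosine similarities
    targets = torch.arange(z1.size(0), device=z1.device)
    # each view must identify its partner among the batch
    return 0.5 * (F.cross_entropy(logits, targets)
                  + F.cross_entropy(logits.t(), targets))
```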
“…This key observation underpins the significant recent advances in generative diffusion models, which use an estimate of the score function to generate samples (Ho et al., 2020; Song et al., 2021). The recent success of DAEs in generative modelling has not yet translated to representation learning, with some exceptions (Asiedu et al., 2022; Zaidi et al., 2022). In this work we exploit a denoising autoencoder to eliminate the MAE inefficiency of reconstructing unmasked patches but never using them.…”
Section: Related Work
Mentioning confidence: 99%