2022
DOI: 10.1038/s42256-022-00447-x
Molecular contrastive learning of representations via graph neural networks

Cited by 363 publications (354 citation statements)
References 40 publications
“…D-MPNN [49] and AttentiveFP [50] are supervised GNN methods. N-gram [51], PretrainGNN [22], GROVER [11], GraphMVP [26], MolCLR [12], and GEM [13] are pretraining methods. N-gram embeds the nodes in the graph and assembles them along short walks to form the graph representation.…”
Section: Molecular Property Prediction
confidence: 99%
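The walk-assembly idea mentioned for N-gram can be sketched in a few lines. The sketch below is an illustration under stated assumptions, not the reference implementation: it takes a walk's embedding to be the elementwise product of its node vectors and concatenates, per walk length, the sum over all walks.

```python
import numpy as np

def ngram_graph_embedding(node_vecs, edges, max_n=3):
    """Assemble node embeddings along short walks into a graph-level
    representation, in the spirit of the N-gram approach: a walk's
    embedding is the elementwise product of its node vectors, and the
    graph embedding concatenates, per walk length n, the sum over walks."""
    adj = {v: [] for v in node_vecs}
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)

    # f[n][v] = sum of embeddings of all walks of length n ending at v
    f = {1: {v: np.asarray(node_vecs[v], dtype=float) for v in node_vecs}}
    for n in range(2, max_n + 1):
        f[n] = {
            v: sum((f[n - 1][u] for u in adj[v]), np.zeros_like(f[1][v]))
            * f[1][v]
            for v in node_vecs
        }
    # graph representation: concatenate per-length sums over all nodes
    return np.concatenate([sum(f[n].values()) for n in range(1, max_n + 1)])

# toy 3-node path graph with 2-dimensional node embeddings
vecs = {0: [1.0, 0.0], 1: [0.5, 0.5], 2: [0.0, 1.0]}
g = ngram_graph_embedding(vecs, edges=[(0, 1), (1, 2)], max_n=2)
print(g.shape)  # (4,) -> two walk lengths x 2-dim embedding
```

The dynamic-programming table `f` avoids enumerating walks explicitly; the function name and toy embeddings are hypothetical.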
“…From the perspective of life science, the properties of molecules and the effects of drugs are largely determined by their 3D structures [14, 15]. Most current MRL methods start by representing molecules as 1D sequential strings, such as SMILES [16, 17, 18] and InChI [19, 20, 21], or as 2D graphs [22, 11, 23, 12, 24]. This may limit their ability to incorporate 3D information for downstream tasks.…”
Section: Introduction
confidence: 99%
“…As an alternative approach, contrastive learning has been applied to bias reduction [37], aiming to learn a high-quality representation via a self-supervised pretext task. This technique has been widely applied across domains, including computer vision [38], graph data [39], and molecular data [40]. With an appropriate proposal distribution for negative sampling, this framework simultaneously improves the discriminative power of the model and reduces exposure bias.…”
Section: Related Work
confidence: 99%
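The pretext task with negative sampling described above is commonly instantiated as an InfoNCE-style objective. A minimal NumPy sketch, assuming cosine similarity and a temperature parameter (not tied to any particular paper's implementation):

```python
import numpy as np

def info_nce(anchor, positive, negatives, tau=0.1):
    """InfoNCE-style contrastive loss: pull the anchor toward its
    positive view and push it away from negative samples, using
    cosine similarity scaled by temperature tau."""
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    logits = np.array([cos(anchor, positive)] +
                      [cos(anchor, n) for n in negatives]) / tau
    # cross-entropy with the positive in slot 0 (log-sum-exp for stability)
    m = logits.max()
    return np.log(np.exp(logits - m).sum()) + m - logits[0]

rng = np.random.default_rng(0)
a = rng.normal(size=8)
pos = a + 0.01 * rng.normal(size=8)   # near-identical "view" of the anchor
negs = [rng.normal(size=8) for _ in range(5)]

loss_good = info_nce(a, pos, negs)                    # matched pair
loss_bad = info_nce(a, negs[0], [pos] + negs[1:])     # mismatched pair
print(loss_good < loss_bad)  # True: the matched pair scores lower loss
```

The quality of the negative (proposal) distribution governs how discriminative the learned representation becomes, which is the point the excerpt makes.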
“…The chemical space in which a drug candidate lies is vast, while drug-related labeled data are limited. Not surprisingly, compared with traditional molecular-fingerprint-based models [9, 10], recent molecular representation learning (MRL) models perform much better on most property prediction tasks [11, 12, 13]. However, to further improve the performance and extend the application scope of existing MRL models, a critical issue must be addressed.…”
Section: Introduction
confidence: 99%