2021
DOI: 10.48550/arxiv.2109.08830
Preprint
Multilingual Molecular Representation Learning via Contrastive Pre-training

Cited by 4 publications (5 citation statements)
References 0 publications
“…Early approaches to molecular representation learning predominantly focused on 1D SMILES (Wang et al, 2019; Chithrananda et al, 2020; Guo et al, 2021; Honda et al, 2019) and 2D graphs (Li et al, 2021; Lu et al, 2021; Fang et al, 2022b; Xia et al, 2022). Recently, there has been a growing interest in 3D molecular data, which could provide a more comprehensive reflection of physical properties, including information not captured by 1D and 2D data, such as conformation details.…”
Section: Related Work
confidence: 99%
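The excerpt above distinguishes three levels of molecular representation: 1D SMILES strings, 2D graphs, and 3D conformations. A minimal sketch of what each level carries, using ethanol as a hypothetical example (the coordinates are made up for illustration; a real conformer would come from a chemistry toolkit such as RDKit):

```python
# Illustrative sketch, not code from any cited paper: the same molecule
# (ethanol) at the three representation levels the excerpt describes.

# 1D: a SMILES string -- a linear text encoding of the molecule.
smiles_1d = "CCO"

# 2D: a graph -- atoms as nodes, bonds as edges (adjacency list).
graph_2d = {
    "atoms": ["C", "C", "O"],
    "bonds": [(0, 1), (1, 2)],  # C-C and C-O single bonds
}

# 3D: a conformation -- per-atom Cartesian coordinates (hypothetical
# values in angstroms, for illustration only).
conformer_3d = [
    ("C", (-0.89, 0.12, 0.00)),
    ("C", (0.58, -0.25, 0.00)),
    ("O", (1.41, 0.91, 0.00)),
]

# The 2D graph makes connectivity explicit where the 1D string encodes it
# implicitly, and the 3D conformer adds geometry that neither the string
# nor the graph contains -- the "conformation details" the excerpt mentions.
print(len(graph_2d["bonds"]), len(conformer_3d))  # 2 bonds, 3 atoms
```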
“…Pre-training based molecular representation learning has shown remarkable performance across various molecular understanding tasks, such as drug discovery (Pinzi & Rastelli, 2019;Adelusi et al, 2022), molecular property prediction (Luo et al, 2022;Liu et al, 2022b;Zhou et al, 2023;Yu et al, 2023) and reaction prediction (Gastegger et al, 2021;Schwaller et al, 2021). Early approaches tend to model 1D SMILES (Wang et al, 2019;Guo et al, 2021;Honda et al, 2019) or 2D graphs (Li et al, 2021;Lu et al, 2021;Fang et al, 2022b;Xia et al, 2022). More recently, there has been a growing interest in 3D molecular data, with its inclusion of 3D structure information providing more comprehensive information of molecules.…”
Section: Introduction
confidence: 99%
“…Transformers have been widely applied for molecular representation learning. However, many works either do not construct a readily available latent space for optimization, as in Uni-Mol [18], or construct a latent space, but do not implement a decoder for generating a molecule, as in KPGT [19], MM-Deacon [20], and GeoT [21]. We require a latent space, along with a decoder, to construct a decision space for optimization, along with the ability to generate a molecule from the vectorized latent representation.…”
Section: Related Work
confidence: 99%
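The requirement stated in this excerpt (a latent space plus a decoder, so that points found by an optimizer can be turned back into molecules) can be sketched with a deliberately toy encoder/decoder. Everything here is hypothetical: the "latent space" is just atom counts over a tiny alphabet, standing in for a learned embedding from any of the cited models.

```python
# Toy sketch of the encoder/decoder requirement, under the assumption that
# counts over a tiny atom alphabet stand in for a learned latent space.
# This is an illustration of the *property* the excerpt asks for, not an
# implementation of Uni-Mol, KPGT, MM-Deacon, or GeoT.

ALPHABET = ["C", "N", "O"]

def encode(smiles: str) -> list[int]:
    """Map a SMILES-like string to a latent vector (atom counts)."""
    return [smiles.count(a) for a in ALPHABET]

def decode(latent: list[int]) -> str:
    """Map a latent vector back to a canonical-order atom string."""
    return "".join(a * n for a, n in zip(ALPHABET, latent))

z = encode("CCO")   # latent point: [2, 0, 1]
mol = decode(z)     # round-trips to "CCO"

# Because decode() exists, any point in the latent (decision) space -- for
# example one proposed by an optimizer -- yields a candidate molecule,
# which is exactly what an encoder-only model cannot offer.
new_z = [z[0], z[1] + 1, z[2]]  # perturb the latent point
print(decode(new_z))            # -> "CCNO"
```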