Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence 2021
DOI: 10.24963/ijcai.2021/309
Learning Attributed Graph Representation with Communicative Message Passing Transformer

Abstract: Constructing appropriate representations of molecules lies at the core of numerous tasks in materials science, chemistry, and drug design. Recent research abstracts molecules as attributed graphs and employs graph neural networks (GNNs) for molecular representation learning, which has achieved remarkable results in molecular graph modeling. Albeit powerful, current models either rely on local aggregation operations and thus miss higher-order graph properties, or focus only on node information without…

Cited by 36 publications (44 citation statements). References 1 publication.
“…In order to capture the topology information of molecular graphs, some recent works [Kearnes et al., 2016; Xiong et al., 2020] attempt to apply GNNs to molecular representation learning. Besides, MPNN [Gilmer et al., 2017] and its variants DMPNN [Yang et al., 2019], CMPNN [Song et al., 2020], and CoMPT [Chen et al., 2021] utilize a message passing framework to better capture the interactions among atoms. However, they still require expensive annotations and barely generalize to unseen molecules, which poses a hurdle to practical applications.…”
Section: Related Work
confidence: 99%
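The message passing framework mentioned in the excerpt can be illustrated with a minimal sketch: each atom aggregates its neighbours' features along the bonds and updates its own state. This is a toy, dependency-free illustration of the idea only; the cited MPNN/DMPNN/CMPNN/CoMPT models use learned neural message and update functions, which are omitted here.

```python
# Toy sketch of one message-passing round on a molecular graph.
# Real MPNN variants replace the sum-and-add below with learned
# message and update networks; this only shows the data flow.

def message_passing_step(node_feats, edges):
    """One synchronous update: each atom sums its neighbours' features.

    node_feats: dict atom_id -> feature vector (list of floats)
    edges: iterable of (u, v) bonds, treated as undirected
    """
    # Collect incoming messages for every atom.
    messages = {a: [0.0] * len(f) for a, f in node_feats.items()}
    for u, v in edges:
        for i, x in enumerate(node_feats[v]):
            messages[u][i] += x
        for i, x in enumerate(node_feats[u]):
            messages[v][i] += x
    # Update: here simply add the aggregated message to the old state.
    return {a: [x + m for x, m in zip(node_feats[a], messages[a])]
            for a in node_feats}

# Toy 3-atom chain A-B-C with one-dimensional features.
feats = {"A": [1.0], "B": [2.0], "C": [3.0]}
out = message_passing_step(feats, [("A", "B"), ("B", "C")])
```

Stacking several such rounds lets information from atoms multiple bonds away reach each node, which is how these models go beyond purely local neighbourhoods.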
“…The global receptive fields of self-attention and the local message passing of graph neural networks are inherently complementary and compatible. Attempts have been made to incorporate graph information into the self-attention computation [35,39,40] and to integrate conventional graph neural networks [12] with the Transformer architecture [11,4].…”
Section: Graph Transformer
confidence: 99%
“…The prediction of molecular properties has been widely considered one of the most significant tasks in computational drug and material discovery (Goh, Hodas, and Vishnu 2017; Wu et al. 2018; Chen et al. 2018). Accurately predicting properties can help evaluate and select appropriate chemical molecules with desired characteristics for many downstream applications (Xiong et al. 2019; Yang et al. 2019; Song et al. 2020; Chen et al. 2021).…”
Section: Introduction
confidence: 99%
“…With the remarkable success of graph neural networks (GNNs) in various graph-related tasks in recent years (Wu et al. 2020), a number of efforts have been made from different directions to design GNN models for molecular property prediction (Yang et al. 2019; Danel et al. 2020; Maziarka et al. 2020; Song et al. 2020; Chen et al. 2021). The fundamental idea is to regard the topology of atoms and bonds as a graph, and to translate each molecule into a representation vector with powerful GNN encoders, followed by a prediction model for the specific property.…”
Section: Introduction
confidence: 99%
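The pipeline described in the last excerpt — atoms and bonds as a graph, a GNN encoder producing node embeddings, and a readout pooling them into one molecule vector — can be sketched minimally as follows. The "encoder" here is a deliberately trivial stand-in (it just appends each atom's bond degree as a structural feature), and the sum-pooling readout is one common choice among several; both are illustrative assumptions, not the cited architectures.

```python
# Sketch of the molecule-to-vector pipeline: graph -> node embeddings
# -> pooled molecule vector for a downstream property predictor.

def encode_molecule(atom_feats, bonds):
    """atom_feats: dict atom_id -> feature list; bonds: (u, v) pairs.
    Toy encoder: append each atom's degree as an extra structural feature.
    Readout: element-wise sum over all atom embeddings."""
    degree = {a: 0 for a in atom_feats}
    for u, v in bonds:
        degree[u] += 1
        degree[v] += 1
    # "Embedding" = raw features plus bond degree (stand-in for a GNN).
    embeddings = {a: f + [float(degree[a])] for a, f in atom_feats.items()}
    dim = len(next(iter(embeddings.values())))
    # Sum-pooling readout produces one fixed-size molecule vector.
    return [sum(e[k] for e in embeddings.values()) for k in range(dim)]

# Water-like toy graph: O bonded to two H atoms, atomic number as feature.
vec = encode_molecule({"O": [8.0], "H1": [1.0], "H2": [1.0]},
                      [("O", "H1"), ("O", "H2")])
```

The resulting fixed-length vector is what the excerpt's "prediction model for the specific property" would consume, e.g. a small feed-forward regressor.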