Identity-aware Graph Neural Networks
2021 | DOI: 10.1609/aaai.v35i12.17283
Abstract: Message passing Graph Neural Networks (GNNs) provide a powerful modeling framework for relational data. However, the expressive power of existing GNNs is upper-bounded by the 1-Weisfeiler-Lehman (1-WL) graph isomorphism test, which means GNNs are not able to predict node clustering coefficients and shortest path distances, and cannot differentiate between different d-regular graphs. Here we develop a class of message passing GNNs, named Identity-aware Graph Neural Networks (ID-GNNs), with greater expressive power than the 1-WL test.
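
As a rough illustration of the identity-aware idea, the sketch below applies a separate weight matrix to the designated root node of an ego network during one round of message passing. It is a minimal sketch in plain PyTorch, assuming dense adjacency matrices; the names (IDGNNLayer, root_mask) are illustrative and not taken from the authors' code, and the published ID-GNN instead runs several rounds of such heterogeneous message passing inside each node's K-hop ego network.

import torch
import torch.nn as nn

class IDGNNLayer(nn.Module):
    # One simplified round of identity-aware message passing on a dense graph:
    # the root ("identity") node uses its own weight matrix, all other nodes
    # share a second one, which is what lets the model distinguish the center
    # node from structurally similar neighbors.
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.w_regular = nn.Linear(in_dim, out_dim)
        self.w_identity = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj, root_mask):
        # x: (N, in_dim) node features, adj: (N, N) dense adjacency,
        # root_mask: (N,) boolean marking the ego-network root.
        msg = torch.where(root_mask.unsqueeze(-1), self.w_identity(x), self.w_regular(x))
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        # Mean-aggregate messages from neighbors and keep a self term.
        return torch.relu(adj @ msg / deg + msg)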

Cited by 202 publications (282 citation statements). References 30 publications (17 reference statements).
“…We use NVIDIA RTX 8000 GPU in the experiments. We implement ROLAND 1 with the GraphGym library [46].…”
Section: Bitcoin-OTC
confidence: 99%
“…Graph Neural Networks (GNNs) are a general class of models that can perform various learning tasks on graphs. GNNs have gained tremendous success in learning from static graphs [1,9,42,45,49]. Although various GNNs have been proposed for dynamic graphs [2,22,24,30,31,34,36,39,47,50], these approaches have limitations in model design, evaluation settings, and training strategies.…”
Section: Introduction and Related Work
confidence: 99%
“…2. It computes a graph embedding with general graph convolutions [12], reduces the number of nodes to one via differential pooling [13], and predicts Q-values for the feature vector of the remaining node. Eventually, the index of the largest Q-value represents the chosen action.…”
Section: Setup
confidence: 99%
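
As a rough sketch of the quoted setup, the hypothetical module below (names such as QGraphNet and n_actions are illustrative) computes node embeddings with simple graph convolutions, softly pools all nodes into a single feature vector in place of the differential pooling step of [13], and predicts one Q-value per action:

import torch
import torch.nn as nn

class QGraphNet(nn.Module):
    # Hypothetical sketch: GCN-style layers -> pool all nodes into one
    # vector -> linear head that outputs one Q-value per action.
    def __init__(self, in_dim, hidden_dim, n_actions):
        super().__init__()
        self.w1 = nn.Linear(in_dim, hidden_dim)
        self.w2 = nn.Linear(hidden_dim, hidden_dim)
        self.pool_score = nn.Linear(hidden_dim, 1)  # soft assignment of every node to the single remaining node
        self.q_head = nn.Linear(hidden_dim, n_actions)

    def forward(self, x, adj):
        # x: (N, in_dim) node features, adj: (N, N) dense adjacency with self-loops.
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        h = torch.relu((adj @ self.w1(x)) / deg)      # graph convolution 1
        h = torch.relu((adj @ self.w2(h)) / deg)      # graph convolution 2
        s = torch.softmax(self.pool_score(h), dim=0)  # (N, 1) pooling weights over nodes
        graph_vec = (s * h).sum(dim=0)                # collapse N nodes to one feature vector
        return self.q_head(graph_vec)                 # one Q-value per action

The chosen action is then the index of the largest Q-value, e.g. torch.argmax(model(x, adj)).item().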
“…Graph Neural Networks (GNNs) have been widely used and have achieved state-of-the-art performance in many related applications, such as node classification [5,6,7,8], link prediction [9,10,11], and graph classification [12,13]. Feature propagation is a simple, efficient, and powerful GNN paradigm [14,15]. The main idea behind it is to obtain new node representations by stacking multiple GNN layers to aggregate the neighbor information of nodes using nonlinear transformations [16].…”
Section: Introduction
confidence: 99%
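
The feature-propagation paradigm described in this last quote amounts to repeated neighborhood aggregation followed by a nonlinear transformation at each layer. A minimal sketch, assuming dense matrices and illustrative names (propagate, weights):

import torch

def propagate(x, adj, weights):
    # x: (N, F) node features, adj: (N, N) adjacency, weights: list of torch.nn.Linear layers.
    # Symmetric normalization A_hat = D^{-1/2} (A + I) D^{-1/2}, as in common GNN layers.
    a = adj + torch.eye(adj.size(0))
    d_inv_sqrt = a.sum(dim=1).pow(-0.5)
    a_hat = d_inv_sqrt.unsqueeze(1) * a * d_inv_sqrt.unsqueeze(0)
    h = x
    for w in weights:                 # stacking multiple GNN layers
        h = torch.relu(a_hat @ w(h))  # aggregate neighbor information, then apply nonlinearity
    return h

For example, weights = [torch.nn.Linear(F, H), torch.nn.Linear(H, H)] gives a two-layer propagation of the kind the quote describes.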