2023
DOI: 10.1609/aaai.v37i4.25573
Beyond Smoothing: Unsupervised Graph Representation Learning with Edge Heterophily Discriminating

Abstract: Unsupervised graph representation learning (UGRL) has drawn increasing research attention and achieved promising results in several graph analytic tasks. Relying on the homophily assumption, existing UGRL methods tend to smooth the learned node representations along all edges, ignoring the existence of heterophilic edges that connect nodes with distinct attributes. As a result, current methods struggle to generalize to heterophilic graphs where dissimilar nodes are widely connected, and are also vulnerable to adve…

Cited by 15 publications (3 citation statements)
References 36 publications (58 reference statements)
“…Recently we have witnessed a surge of interest in the robustness of GCNs on heterophilic graphs. These methods can be categorized into structure-learning-based ones (Jin et al. 2020, 2021a; He et al. 2022; Zhu et al. 2022; Liu et al. 2023) and ones based on adversarial training (Dai et al. 2018; Zhu et al. 2019; Zhang, Zhang, and Cheng 2020; Zhang and Zitnik 2020; Suresh et al. 2021b). The most related to our work are ProGNN (Jin et al. 2020), which exploits the low-rank and sparse structure of the graph, and SimP-GCN (Jin et al. 2021b), which relies on a similarity-preservation scheme for structure learning.…”
Section: Robust Graph Convolution Network
confidence: 99%
“…Baselines: We follow previous works (Jin et al. 2021b; He et al. 2022) and use eleven baselines, grouped into three categories: 1) multi-hop-based approaches MixHop (Abu-El-Haija et al. 2019) and H2GCN (Zhu et al. 2020), which mix multi-hop neighbors for aggregation; 2) ranking-based approaches NLGNN (Liu, Wang, and Ji 2021), GEOM-GCN (Pei et al. 2020), Node2Seq (Yuan and Ji 2021), and GPNN (Yang et al. 2022), which search over the network structure and then perform selective aggregation; 3) structure-learning approaches ProGNN (Jin et al. 2020), UGCN (Jin et al. 2021a), BM-GCN (He et al. 2022), and GREET (Liu et al. 2023), which automatically learn graph structures for aggregation. Specifically, ProGNN preserves the low-rank and sparsity characteristics of the graph structure for robust GCN.…”
Section: Experiments: Datasets, Baselines, and Settings
confidence: 99%
“…In the meantime, to capture global structural knowledge, we introduce a random-walk-based structure embedding (RWSE), which is computed from the random-walk diffusion process (Tong, Faloutsos, and Pan 2006; Dwivedi et al. 2021; Liu et al. 2023b). Concretely, RWSE is denoted as…”
confidence: 99%
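The RWSE equation itself is truncated in this excerpt. As a rough sketch of the general technique the citation points to (the random-walk positional encoding of Dwivedi et al. 2021, not necessarily the exact formulation of the citing paper), each node's embedding can be taken as its return probabilities under increasing powers of the random-walk transition matrix; the function name `rwse` and the dense-NumPy formulation here are illustrative assumptions:

```python
import numpy as np

def rwse(adj: np.ndarray, k: int = 4) -> np.ndarray:
    """Random-walk structure embedding (illustrative sketch).

    For each node i, collects the k return probabilities
    [RW_ii, (RW^2)_ii, ..., (RW^k)_ii], where RW = A D^{-1}
    is the random-walk transition matrix of the graph with
    adjacency matrix A and degree matrix D.
    """
    deg = adj.sum(axis=0)
    rw = adj / deg[None, :]          # column-normalize: RW = A D^{-1}
    p = np.eye(adj.shape[0])         # RW^0 = I
    feats = []
    for _ in range(k):
        p = rw @ p                   # advance the diffusion one step
        feats.append(np.diag(p))     # return probability of each node
    return np.stack(feats, axis=1)   # shape (n_nodes, k)
```

For a triangle graph every walk must leave its start node in one step and can return in two, so the first column is all zeros and the second is 0.5 for every node; such powers-of-the-transition-matrix features are what lets the embedding encode global structure beyond immediate neighborhoods.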