Proceedings of the ACM Web Conference 2023
DOI: 10.1145/3543507.3583324
Graph Neural Networks with Diverse Spectral Filtering

Abstract: Spectral Graph Neural Networks (GNNs) have achieved tremendous success in graph machine learning, with polynomial filters applied for graph convolutions, where all nodes share identical filter weights to mine their local contexts. Despite this success, existing spectral GNNs usually fail to deal with complex networks (e.g., the WWW), because such a homogeneous spectral filtering setting ignores the regional heterogeneity typically seen in real-world networks. To tackle this issue, we propose a novel diverse…
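The homogeneous polynomial filtering the abstract describes can be sketched as follows. This is a minimal illustration under common conventions (symmetric normalized Laplacian, monomial basis), not the paper's implementation; the `diverse_poly_filter` variant at the end only gestures at the idea of node-specific coefficients, and its parameterization is an assumption for illustration.

```python
import numpy as np

def normalized_laplacian(A):
    """Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    d = A.sum(axis=1)
    d_inv_sqrt = np.where(d > 0, d, 1.0) ** -0.5
    d_inv_sqrt[d == 0] = 0.0  # isolated nodes get zero weight
    return np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

def poly_filter(A, X, w):
    """Homogeneous filtering: H = sum_k w[k] * L^k @ X.
    The scalar coefficients w[k] are shared by ALL nodes."""
    L = normalized_laplacian(A)
    H = np.zeros_like(X, dtype=float)
    P = X.astype(float)          # P holds L^k @ X, starting at k = 0
    for wk in w:
        H += wk * P
        P = L @ P
    return H

def diverse_poly_filter(A, X, W):
    """Illustrative 'diverse' variant: W has shape (num_nodes, K),
    so each node applies its own polynomial coefficients to its row.
    A sketch of node-wise filtering, not the paper's exact method."""
    L = normalized_laplacian(A)
    H = np.zeros_like(X, dtype=float)
    P = X.astype(float)
    for k in range(W.shape[1]):
        H += W[:, k:k + 1] * P   # node-specific weight per row
        P = L @ P
    return H
```

When every row of `W` is identical, the diverse variant reduces to the shared-weight filter, which matches the abstract's framing of homogeneous filtering as a special case.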

Cited by 5 publications (2 citation statements)
References 32 publications
“…For the Amazon and Coauthor datasets, seven baselines are used, as in Table 3. For the 3-layer MLP, GCN, and 3ference, results are taken from (Luo et al. 2022), and the DSF result comes from (Guo et al. 2023). For the others, experiments were performed by randomly splitting the data 60%/20%/20% into training/validation/testing sets, as in (Luo et al. 2022), and repeating this 10 times to obtain the mean and standard deviation of the evaluation metric.…”
Section: Semi-supervised Node Classification (mentioning, confidence: 99%)
“…Several recent works have tried to overcome this locality issue. Methods in (Veličković et al. 2018; Kim et al. 2022) leverage attention to capture long-range relationships among nodes, the authors of (Gao et al. 2019; Wu et al. 2022b) develop pooling schemes to compress graphs, and the authors of (Chen et al. 2020) improve upon the vanilla GCN with residual skip connections as in ResNet (He et al. 2016).…”
Section: Introduction (mentioning, confidence: 99%)