Proceedings of the Web Conference 2021 2021
DOI: 10.1145/3442381.3449896
Improving Graph Neural Networks with Structural Adaptive Receptive Fields

Cited by 14 publications (4 citation statements)
References 12 publications
“…However, these GNNs focus on fixed graphs with feature-level similarities and cannot adapt to spatiotemporal graphs. To enable an adaptive topology, AGCRN [31] learns a dynamic topology from a feature-wise product, while STAR-GNN [46] and the L2P framework [47] design, respectively, a mutual-information-based strategy and a generative process to estimate the optimal receptive fields for GNNs. More recently, exogenous context factors that lie outside the graph have been shown to interfere with the intrinsic graph topology and to have complicated interactions with element-wise aggregations [25], [27], [48].…”
Section: F. Hyperparameter Settings
confidence: 99%
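The feature-wise-product idea mentioned above (learning a dense topology from node embeddings rather than using a fixed adjacency) can be sketched as follows. This is a minimal illustrative sketch, not AGCRN's actual implementation; the function and variable names are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable row-wise softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def adaptive_adjacency(node_emb):
    """Derive a dense adjacency from (learnable) node embeddings via a
    feature-wise inner product: ReLU of pairwise similarities, then a
    row-wise softmax so each row forms aggregation weights."""
    scores = np.maximum(node_emb @ node_emb.T, 0.0)  # ReLU(E E^T)
    return softmax(scores, axis=1)

rng = np.random.default_rng(0)
E = rng.standard_normal((5, 8))  # 5 nodes, 8-dim embeddings (learned in practice)
A = adaptive_adjacency(E)
print(A.shape)        # (5, 5)
print(A.sum(axis=1))  # each row sums to 1
```

In a full model, `E` would be a trainable parameter updated end-to-end, so the induced topology adapts to the task rather than staying fixed.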
“…AGFormer (Jiang et al 2023) first selects representative anchors and then transforms node-to-node message passing into an anchor-to-anchor and anchor-to-node message passing process. Moreover, AGT (Ma et al 2023b) extracts the structural patterns from subgraph views and designs an adaptive transformer block to dynamically integrate attention scores in a node-specific manner.…”
Section: Introduction
confidence: 99%
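The anchor-based scheme described in the excerpt (replace quadratic node-to-node attention with attention among a small anchor set, then back to all nodes) can be sketched roughly as below. This is an illustrative simplification under assumed shapes, not AGFormer's actual architecture; all names are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable row-wise softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def anchor_propagation(H, anchor_idx):
    """Two-stage propagation sketch: (1) select representative anchors,
    (2) anchor-to-anchor attention mixes the anchor set, (3) anchor-to-node
    attention broadcasts anchor summaries back to every node."""
    A = H[anchor_idx]                    # (k, d) selected anchors
    A = softmax(A @ A.T, axis=1) @ A     # anchor-to-anchor mixing
    return softmax(H @ A.T, axis=1) @ A  # anchor-to-node update, (n, d)

rng = np.random.default_rng(1)
H = rng.standard_normal((10, 4))        # 10 nodes, 4-dim features
out = anchor_propagation(H, [0, 3, 7])  # 3 anchors
print(out.shape)  # (10, 4)
```

With k anchors the attention cost drops from O(n^2) to O(nk), which is the motivation for routing messages through anchors in the first place.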
“…Graph neural networks (GNNs) [56] have become the state-of-the-art methods in many graph representation learning scenarios such as node classification [8,31,32,61], link prediction [3,14,47,54], recommendation [17,34,53,58], and knowledge graphs [1,48,49,51]. Most GNN pipelines can be described in terms of the neural message passing (NMP) framework [15], which is based on the core idea of recursive neighborhood aggregation and transformation.…”
Section: Introduction
confidence: 99%
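The core NMP idea named in the excerpt (recursive neighborhood aggregation followed by a learned transformation) can be written in a few lines. A minimal sketch, assuming mean aggregation and a single ReLU layer; real frameworks parameterize both steps.

```python
import numpy as np

def message_passing_layer(A, H, W):
    """One NMP round: aggregate each node's neighbor features (mean over
    neighbors via the adjacency matrix), then transform with weights W
    and a ReLU nonlinearity."""
    deg = A.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0              # guard isolated nodes against /0
    agg = (A @ H) / deg              # mean-aggregate neighbor features
    return np.maximum(agg @ W, 0.0)  # transform + nonlinearity

# Toy 4-node path graph: 0-1-2-3
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
H = np.eye(4)              # one-hot input features
W = np.full((4, 2), 0.5)   # toy weight matrix
H1 = message_passing_layer(A, H, W)
print(H1.shape)  # (4, 2)
```

Stacking L such layers ("recursive" aggregation) grows each node's receptive field to its L-hop neighborhood, which is exactly the quantity that adaptive-receptive-field methods like the surveyed paper try to tune per node.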