2020
DOI: 10.48550/arxiv.2002.07206
Preprint

Ripple Walk Training: A Subgraph-based training framework for Large and Deep Graph Neural Network

Abstract: Graph neural networks (GNNs) have achieved outstanding performance in learning graph-structured data. Many current GNNs suffer from three problems when facing large-size graphs and using a deeper structure: neighbors explosion, node dependence, and oversmoothing. In this paper, we propose a general subgraph-based training framework, namely Ripple Walk Training (RWT), for deep and large graph neural networks. RWT samples subgraphs from the full graph to constitute a mini-batch and the full GNN is updated based …
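The abstract's core idea is to build each mini-batch from sampled subgraphs rather than from the full graph. The sketch below illustrates a ripple-walk-style expansion sampler in plain Python; the adjacency-list representation, the expansion ratio `r`, and the target subgraph size are illustrative assumptions and are not taken from the authors' implementation.

```python
import random

def ripple_walk_subgraph(adj, target_size, r=0.5):
    """Sample one subgraph by ripple-like expansion (illustrative sketch,
    not the paper's exact algorithm)."""
    seed = random.choice(list(adj))           # random starting node
    subgraph = {seed}
    while len(subgraph) < target_size:
        # Frontier: neighbors of the current subgraph not yet included.
        frontier = {v for u in subgraph for v in adj[u]} - subgraph
        if not frontier:                       # no reachable nodes left
            break
        # Add a fraction r of the frontier in each expansion step.
        k = max(1, int(r * len(frontier)))
        subgraph |= set(random.sample(sorted(frontier), min(k, len(frontier))))
    return subgraph

def sample_mini_batch(adj, num_subgraphs, target_size, r=0.5):
    """A mini-batch is a collection of independently sampled subgraphs;
    the GNN would then be updated on each subgraph in turn."""
    return [ripple_walk_subgraph(adj, target_size, r) for _ in range(num_subgraphs)]

# Example usage on a tiny adjacency-list graph (nodes 0..5).
adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 4], 3: [1, 4], 4: [2, 3, 5], 5: [4]}
print(sample_mini_batch(adj, num_subgraphs=2, target_size=4))
```

Because each subgraph is small and self-contained, gradients can be computed without touching nodes outside the batch, which is what lets this style of training sidestep the neighbor-explosion and memory problems mentioned in the abstract.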

Cited by 7 publications (11 citation statements)
References 22 publications
“…Sampling methods used in works [26, 55, 56] all satisfy the mechanism that subgraphs are induced from node (edge) extension. These methods have many differences in various aspects.…”
Section: Subgraph-based Sampling Methods
confidence: 99%
“…RWT [55], Parallelized Graph Sampling [56]; Heterogeneous sampling method: Time-related sampling [61], HetGNN [62], …”
Section: Categories / Work
confidence: 99%
“…Subgraph-based sampling samples a subgraph, which is composed of selected vertices and edges, and conducts training using the subgraph. Existing work generates subgraphs by either partitioning the whole graph or extending vertices and edges using specific policies [2,8,39]. For example, ClusterGCN [8] first partitions the whole graph into multiple clusters using graph clustering algorithms and then randomly samples a fixed number of clusters as a mini-batch by combining these clusters into a subgraph.…”
Section: Sampling Algorithms
confidence: 99%
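The statement above describes the cluster-based variant of subgraph sampling (ClusterGCN): partition the graph once, then randomly combine clusters into a batch subgraph. A minimal sketch of that mechanism is shown below; the adjacency-dict format, the precomputed `clusters` input, and the parameter `q` are illustrative assumptions, and the partitioning step itself (e.g. with a tool such as METIS) is outside the sketch.

```python
import random

def cluster_style_batch(adj, clusters, q):
    """Combine q randomly chosen clusters into one training subgraph
    (ClusterGCN-style sketch; the clustering step is assumed already done)."""
    chosen = random.sample(clusters, q)
    nodes = set().union(*chosen)
    # Induce the subgraph: keep only edges with both endpoints in the batch.
    return {u: [v for v in adj[u] if v in nodes] for u in nodes}

# Example: a toy partition of a 6-node graph into three clusters.
adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 4], 3: [1, 4], 4: [2, 3, 5], 5: [4]}
clusters = [{0, 1, 2}, {3, 4}, {5}]
print(cluster_style_batch(adj, clusters, q=2))
```

This contrasts with the ripple-walk sketch earlier: clusters are fixed up front and recombined, whereas ripple-walk-style samplers grow each subgraph on the fly by neighbor extension.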
“…For example, specific patterns of atoms or modes of interaction can be discovered by identifying specific subgraph topologies. Bai et al. [12] propose a general subgraph-based training framework referred to as Ripple Walk Training (RWT). This can not only accelerate the training speed on large graphs but also solve problems associated with the memory bottleneck.…”
Section: Introduction
confidence: 99%