2019 IEEE International Parallel and Distributed Processing Symposium (IPDPS)
DOI: 10.1109/ipdps.2019.00056
Accurate, Efficient and Scalable Graph Embedding

Abstract: The Graph Convolutional Network (GCN) model and its variants are powerful graph embedding tools for facilitating classification and clustering on graphs. However, a major challenge is to reduce the complexity of layered GCNs and make them parallelizable and scalable on very large graphs: state-of-the-art techniques are unable to achieve scalability without losing accuracy and efficiency. In this paper, we propose novel parallelization techniques for graph sampling-based GCNs that achieve superior scalable performance…
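The abstract centers on graph sampling-based GCN training, where each step trains on a small sampled subgraph rather than the full graph, so per-step work is independent of graph size. Below is a minimal sketch of that idea, assuming a hypothetical uniform node sampler and a single GCN layer; all names are illustrative, not the paper's actual implementation.

```python
# Minimal sketch of graph sampling-based GCN training: sample a subgraph,
# then run the GCN layer only on the sampled nodes.
# sample_subgraph and gcn_layer are illustrative names (assumptions).
import numpy as np

rng = np.random.default_rng(0)

def sample_subgraph(adj, num_nodes):
    """Uniform node sampler: pick nodes, keep the induced subgraph."""
    idx = rng.choice(adj.shape[0], size=num_nodes, replace=False)
    sub_adj = adj[np.ix_(idx, idx)]
    # Row-normalize so the layer computes a mean over sampled neighbors.
    deg = sub_adj.sum(axis=1, keepdims=True)
    return idx, sub_adj / np.maximum(deg, 1)

def gcn_layer(norm_adj, x, w):
    """One GCN layer: aggregate neighbor features, transform, apply ReLU."""
    return np.maximum(norm_adj @ x @ w, 0.0)

# Toy graph: 1000 nodes, random sparse adjacency, 16-dim features.
n, f_in, f_out = 1000, 16, 8
adj = (rng.random((n, n)) < 0.01).astype(np.float64)
features = rng.standard_normal((n, f_in))
weights = rng.standard_normal((f_in, f_out)) * 0.1

# One training step touches only the 64 sampled nodes, not all 1000.
idx, sub_adj = sample_subgraph(adj, num_nodes=64)
hidden = gcn_layer(sub_adj, features[idx], weights)
print(hidden.shape)  # (64, 8)
```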

Cited by 112 publications (173 citation statements: 1 supporting, 172 mentioning, 0 contrasting)
References 4 publications (4 reference statements)
“…To accelerate GCN training, the work in [34] proposes parallelization techniques for multi-core platforms. It partitions features to increase the cache-hit rate of each core.…”
Section: Deep Learning Training Accelerators (mentioning)
Confidence: 99%
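The feature-partitioning idea this excerpt attributes to [34] can be illustrated with a short sketch: split the feature matrix column-wise so each core repeatedly works on a cache-resident slice during aggregation. The chunking and thread mapping below are assumptions for illustration, not the paper's actual scheme.

```python
# Hedged sketch of feature partitioning for cache locality: each core
# computes norm_adj @ x over its own column slice of x, so its working
# set stays small. Chunk size and thread count are illustrative.
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def aggregate_partitioned(norm_adj, x, num_cores=4):
    """Compute norm_adj @ x with the feature dimension split across cores."""
    n, f = x.shape
    chunk = (f + num_cores - 1) // num_cores  # columns per core
    out = np.empty((norm_adj.shape[0], f))

    def worker(start):
        end = min(start + chunk, f)
        # Each core touches only its own f/num_cores columns, improving
        # cache-hit rates versus streaming all of x through every core.
        out[:, start:end] = norm_adj @ x[:, start:end]

    with ThreadPoolExecutor(max_workers=num_cores) as pool:
        list(pool.map(worker, range(0, f, chunk)))
    return out

rng = np.random.default_rng(1)
adj = rng.random((256, 256))
feats = rng.standard_normal((256, 64))
np.testing.assert_allclose(aggregate_partitioned(adj, feats), adj @ feats)
```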
“…Ideally, a minibatch should be sampled such that all data required for gradient calculation (e.g., X^(ℓ) of the minibatch nodes) fits in BRAM. Among the numerous algorithms [5, 6, 14, 15, 34], some return minibatches not suitable for hardware execution. We categorize these algorithms and analyze the hardware cost of external memory accesses.…”
Section: Algorithm Selection (mentioning)
Confidence: 99%
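The constraint this excerpt describes, sampling a minibatch only as large as on-chip memory allows, can be sketched as a greedy budget check. The byte accounting and budget below are illustrative stand-ins for a real BRAM capacity model, not the citing paper's actual cost analysis.

```python
# Sketch of minibatch sampling under a fixed on-chip memory budget: grow
# the batch only while the feature rows needed for the gradient still fit.
# Budget, dtype size, and candidate order are illustrative assumptions.
import numpy as np

def sample_within_budget(candidates, feat_dim, budget_bytes, bytes_per_val=4):
    """Greedily add candidate nodes until their feature rows exceed budget."""
    batch, used = [], 0
    row_bytes = feat_dim * bytes_per_val
    for node in candidates:
        if used + row_bytes > budget_bytes:
            break  # the next node's X^(l) row would overflow on-chip memory
        batch.append(node)
        used += row_bytes
    return batch, used

rng = np.random.default_rng(2)
cands = rng.permutation(10_000)
batch, used = sample_within_budget(cands, feat_dim=256,
                                   budget_bytes=1 << 20)  # 1 MiB budget
print(len(batch), used)  # 1024 nodes, exactly 1 MiB of feature data
```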