2022
DOI: 10.1109/tcad.2022.3197342
Accelerating Large-Scale Graph Neural Network Training on Crossbar Diet

Cited by 7 publications (2 citation statements)
References 31 publications
“…[46] proposes GCoD, a GCN algorithm and accelerator Co-Design framework, involving a two-pronged accelerator with a separated engine to process dense and sparse workloads. Some previous studies focus on accelerating GNN training [48,49,10,50]. GraphACT [48] introduces an FPGA-based accelerator with a subgraph-based algorithm for Graph Convolutional Networks (GCNs) training.…”
Section: Related Work
confidence: 99%
“…[10] proposes HP-GNN, which maps GNN training onto a CPU-FPGA platform automatically. DietGNN [50], a crossbar-aware pruning technique, is proposed to accelerate the training of large-scale GNNs.…”
Section: Related Work
confidence: 99%