2022
DOI: 10.1016/j.neucom.2022.08.022

HIRE: Distilling high-order relational knowledge from heterogeneous graph neural networks

Cited by 4 publications (6 citation statements)
References 27 publications
“…Output layer: GFKD [46], RDD [47], GKD [48], GLNN [49], Distill2Vec [50], MT-GCN [51], TinyGNN [52], GLocalKD [53], SCR [54], ROD [55], EGNN [56]. Middle layer: LWC-KD [57], MustaD [58], EGAD [59], AGNN [60], Cold Brew [61], PGD [62], OAD [63], CKD [64], BGNN [65], EGSC [66], HSKDM [67]. Constructed graph: GRL [68], GFL [69], HGKT [70], CPF [71], LSP [16], scGCN [72], MetaHG [73], G-CRD [74], HIRE [75]. SKD methods…”
Section: Output Layer (mentioning)
confidence: 99%
“…Insight and strength: Compared with the DKD method, GKD differs most in that a GNN is a powerful tool for modeling graphs: it can be distilled directly at the middle/output layer, transferring topological knowledge between graph nodes to the student model. To further explore the relationships among local nodes in the feature space, considerable effort has been devoted to constructing a relational graph between nodes at the middle convolutional layer and extracting the correlation knowledge between them, as in LSP [16] and HIRE [75]. The successful application of knowledge distillation to GNNs has attracted widespread attention from academia and industry.…”
Section: Graph-based Knowledge Distillation for Graph Neural Networks (mentioning)
confidence: 99%
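
The excerpt above describes the core idea behind LSP [16] and HIRE [75]: the student should reproduce the relational structure that the teacher's embeddings induce over the graph, not just its per-node outputs. Below is a minimal PyTorch sketch of an LSP-style local-structure loss, assuming only teacher/student node embeddings and a directed edge list; the function name lsp_loss, the RBF-kernel similarity, and all variable names are illustrative assumptions, not the papers' actual implementations.

import torch

def lsp_loss(h_teacher, h_student, edge_index):
    """KL divergence between the neighbor-similarity distributions
    induced by teacher and student node embeddings (LSP-style)."""
    src, dst = edge_index  # two (num_edges,) index tensors

    # Log-RBF similarity between the embeddings at each edge's endpoints.
    def edge_sim(h):
        diff = h[src] - h[dst]
        return -(diff * diff).sum(dim=-1)

    # Normalize each node's outgoing-edge similarities into a
    # probability distribution via a per-node softmax.
    def neighbor_softmax(scores, num_nodes):
        exp = (scores - scores.max()).exp()  # shift for numerical stability
        denom = torch.zeros(num_nodes, device=exp.device).index_add(0, src, exp)
        return exp / (denom[src] + 1e-12)

    n = h_student.size(0)
    p_t = neighbor_softmax(edge_sim(h_teacher), n)  # teacher's local structure
    p_s = neighbor_softmax(edge_sim(h_student), n)  # student's local structure

    # KL(teacher || student), summed per edge and averaged over nodes.
    kl = p_t * (torch.log(p_t + 1e-12) - torch.log(p_s + 1e-12))
    return kl.sum() / n

# Toy usage: teacher and student embedding widths may differ, since each
# similarity distribution is computed within its own embedding space.
h_t = torch.randn(4, 16)                            # teacher embeddings
h_s = torch.randn(4, 8)                             # student embeddings
edges = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 0]])  # directed 4-cycle
loss = lsp_loss(h_t, h_s, edges)

Because the loss compares distributions rather than raw activations, it sidesteps dimension mismatches between teacher and student, which is one reason structure-matching objectives of this kind are popular for compressing GNNs.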