2021 IEEE International Conference on Robotics and Automation (ICRA)
DOI: 10.1109/icra48506.2021.9561158
Dark Reciprocal-Rank: Teacher-to-student Knowledge Transfer from Self-localization Model to Graph-convolutional Neural Network

Cited by 3 publications (4 citation statements)
References 46 publications
“…As a representative work of spatial methods, GCN [17] further simplifies graph convolution in the spectral domain by using a first-order approximation, which enables graph convolution operations to be carried out in the spatial domain and greatly improves the computational efficiency of graph convolution models. Moreover, to speed up the training of graph neural networks, …”

[Table residue from the citing paper: a taxonomy of graph-based knowledge distillation — DKD methods by knowledge location (output layer: DKWISL [18], KTG [19], DGCN [20], SPG [21], GCLN [22]; middle layer: IEP [23], HKD [24], MHGD [25], IRG [26], DOD [27], HKDIFM [28], KDExplainer [29], TDD [30], DualDE [31]; constructed graph: CAG [32], GKD [33], MorsE [34], BAF [35], LAD [36], GD [37], GCMT [38], GraSSNet [39], LSN [40], IntRA-KD [41], RKD [42], CC [43], SPKD [44], KCAN [45]) and GKD methods.]

Section: Graph Neural Network
confidence: 99%
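The first-order approximation mentioned in the quote reduces spectral graph convolution to a single neighborhood-averaging step, H' = σ(D̂^(-1/2) Â D̂^(-1/2) H W) with Â = A + I. A minimal numpy sketch of that propagation rule (function name and shapes are illustrative, not from the cited works):

```python
import numpy as np

def gcn_layer(A, H, W):
    """One first-order GCN propagation step (illustrative sketch):
    H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W).

    A: (n, n) adjacency matrix, H: (n, f_in) node features,
    W: (f_in, f_out) weight matrix.
    """
    A = np.asarray(A, dtype=float)
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    deg = A_hat.sum(axis=1)                 # degrees of A + I
    D_inv_sqrt = np.diag(deg ** -0.5)       # D^{-1/2}
    H_next = D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W
    return np.maximum(H_next, 0.0)          # ReLU nonlinearity
```

Because the normalized adjacency is sparse in practice, this replaces the dense spectral filtering of earlier methods with a cheap local aggregation, which is the efficiency gain the statement refers to.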
“…For example, in DKD methods based on output-layer knowledge: DKWISL [18] applies KD to relation extraction in NLP by using the KL distance metric. KTG [19] is used for image-recognition applications of collaborative learning by using KL to measure the distribution difference between teachers and students, while GCLN [22] utilizes L2 to apply KD to visual robot localization scenarios for image semantic segmentation. Among the DKD methods based on middle-layer knowledge: IEP [23] combines KL and L1 to apply knowledge transfer to transfer learning and image classification in multi-task learning.…”

Section: Graph-based Knowledge Distillation for Deep Neural Network
confidence: 99%
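The KL-based teacher-to-student objective these statements describe is commonly written as KL between temperature-softened teacher and student distributions, scaled by T². A hedged numpy sketch of that generic loss (names and the T² scaling follow Hinton-style distillation, not any specific method above):

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-softened softmax over the last axis."""
    z = np.asarray(z, dtype=float) / T
    e = np.exp(z - z.max(axis=-1, keepdims=True))  # stable exponentiation
    return e / e.sum(axis=-1, keepdims=True)

def kd_kl_loss(teacher_logits, student_logits, T=2.0):
    """Mean KL(teacher || student) on softened distributions, scaled by T^2.

    Illustrative sketch of a distillation objective; the methods cited
    above differ in where (output vs. middle layer) the loss is applied.
    """
    p = softmax(teacher_logits, T)                 # soft teacher targets
    q = softmax(student_logits, T)                 # student predictions
    kl = (p * (np.log(p) - np.log(q))).sum(axis=-1)
    return float(kl.mean() * T * T)
```

An L2 variant (as in GCLN per the quote) would instead penalize the squared distance between teacher and student outputs or features directly.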