2024
DOI: 10.1088/1742-6596/2759/1/012010
Distilling Structural Knowledge for Platform-Aware Semantic Segmentation

Guilin Li, Qiang Wang, Xiawu Zheng

Abstract: Knowledge Distillation (KD) aims to distill the dark knowledge of a high-powered teacher network into a student network, improving the capacity of the student network; it has been successfully applied to semantic segmentation. However, standard knowledge distillation approaches merely represent the supervisory signal of the teacher network as the dark knowledge, while ignoring the impact of network architecture during distillation. In this paper, we found that the student network with a more similar archit…
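The standard KD objective the abstract refers to is the Hinton-style distillation loss: the teacher's temperature-softened output distribution (the "dark knowledge") supervises the student via a KL-divergence term. The sketch below shows this generic objective for a single prediction; it is an illustration of standard KD only, not the platform-aware, architecture-sensitive variant this paper proposes, and the function names and temperature value are illustrative assumptions.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a list of logits."""
    exps = [math.exp(z / T) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kd_loss(teacher_logits, student_logits, T=4.0):
    """Standard distillation loss: KL(teacher || student) between
    temperature-softened distributions, scaled by T^2 (Hinton et al.)."""
    p = softmax(teacher_logits, T)  # soft teacher targets ("dark knowledge")
    q = softmax(student_logits, T)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return (T ** 2) * kl

# A student that matches the teacher exactly incurs zero loss:
print(kd_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))  # → 0.0
```

In semantic segmentation this loss is typically applied per pixel over the class dimension and averaged over the image; the temperature T controls how much of the teacher's inter-class similarity structure is exposed to the student.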
Cited by 0 publications
References 21 publications