2023
DOI: 10.48550/arxiv.2303.08360
Preprint

Knowledge Distillation from Single to Multi Labels: an Empirical Study

Abstract: Knowledge distillation (KD) has been extensively studied in single-label image classification. However, its efficacy for multi-label classification remains relatively unexplored. In this study, we first investigate the effectiveness of classical KD techniques, including logit-based and feature-based methods, for multi-label classification. Our findings indicate that the logit-based method is not well-suited for multi-label classification, as the teacher fails to provide inter-category similarity information o…
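
The abstract's claim about logit-based KD can be made concrete. Below is a minimal sketch (assuming PyTorch; the function names, temperature value, and tensor shapes are illustrative, not taken from the paper) contrasting classical single-label logit distillation, where the softmax couples all classes and so transfers inter-category similarity, with a naive multi-label analogue, where independent per-class sigmoids discard that coupling:

```python
import torch
import torch.nn.functional as F

def single_label_logit_kd(student_logits, teacher_logits, T=4.0):
    """Classical logit-based KD (Hinton-style): KL divergence between
    temperature-softened softmax distributions. The softmax normalizes
    across classes, so the soft targets encode inter-category similarity."""
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    # T^2 rescales gradients to match the hard-label loss magnitude.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * T * T

def multi_label_logit_kd(student_logits, teacher_logits):
    """A per-class analogue for multi-label outputs: each class gets an
    independent sigmoid, so the teacher's soft targets carry no
    relationship between categories -- the limitation the abstract notes."""
    p_teacher = torch.sigmoid(teacher_logits)
    return F.binary_cross_entropy_with_logits(student_logits, p_teacher)

# Example usage: random logits for a batch of 8 images over 20 classes.
s, t = torch.randn(8, 20), torch.randn(8, 20)
print(single_label_logit_kd(s, t).item())
print(multi_label_logit_kd(s, t).item())
```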

Cited by 0 publications
References 59 publications