2021
DOI: 10.48550/arxiv.2107.06993
Preprint

Confidence Conditioned Knowledge Distillation

Abstract: In this paper, a novel confidence conditioned knowledge distillation (CCKD) scheme for transferring knowledge from a teacher model to a student model is proposed. Existing state-of-the-art methods employ fixed loss functions for this purpose and ignore the different levels of information that need to be transferred for different samples. These methods are also inefficient in terms of data usage. CCKD addresses these issues by leveraging the confidence assigned by the teacher model to the…
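
The abstract is truncated before the paper's exact formulation, but the general idea of conditioning the distillation loss on teacher confidence can be illustrated. Below is a minimal PyTorch sketch, assuming the confidence is the teacher's maximum softmax probability and that it weights a per-sample blend of the distillation (KL) term and the ground-truth cross-entropy term; both choices are assumptions for illustration, not the paper's definitions.

```python
import torch
import torch.nn.functional as F

def cckd_loss(student_logits, teacher_logits, targets, temperature=4.0):
    """Hypothetical confidence-conditioned KD loss (illustrative only).

    The confidence measure (teacher's max softmax probability) and the
    per-sample blending rule are assumptions, not taken from the paper.
    """
    # Softened teacher targets and student log-probabilities.
    teacher_probs = F.softmax(teacher_logits / temperature, dim=1)
    student_log_probs = F.log_softmax(student_logits / temperature, dim=1)

    # Per-sample KL divergence, with the usual T^2 scaling.
    kd = F.kl_div(student_log_probs, teacher_probs, reduction="none").sum(dim=1)
    kd = kd * temperature ** 2

    # Per-sample cross-entropy against the ground-truth labels.
    ce = F.cross_entropy(student_logits, targets, reduction="none")

    # Assumed confidence: the teacher's max class probability (T = 1).
    confidence = F.softmax(teacher_logits, dim=1).max(dim=1).values

    # High-confidence samples lean on the teacher; low-confidence ones
    # fall back on the hard labels, so the loss varies per sample.
    return (confidence * kd + (1.0 - confidence) * ce).mean()
```

Weighting each sample individually, rather than using one fixed mixing coefficient for the whole batch, is what distinguishes this sketch from standard knowledge distillation and matches the abstract's criticism of fixed loss functions.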

Cited by 0 publications
References 20 publications