2024
DOI: 10.1111/exsy.13593

DFEF: Diversify feature enhancement and fusion for online knowledge distillation

Xingzhu Liang,
Jian Zhang,
Erhu Liu
et al.

Abstract: Traditional knowledge distillation relies on high‐capacity teacher models to supervise the training of compact student networks. To avoid the computational resource costs associated with pretraining high‐capacity teacher models, teacher‐free online knowledge distillation methods have achieved satisfactory performance. Among these methods, feature fusion methods have effectively alleviated the limitations of training without the strong guidance of a powerful teacher model. However, existing feature fusion methods…
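
The abstract describes the general family of teacher-free online knowledge distillation with feature fusion only at a high level. The sketch below illustrates that general idea under stated assumptions: two hypothetical peer networks whose intermediate features are concatenated and classified by a fusion head that acts as the "virtual teacher", plus a standard temperature-scaled KL distillation loss. This is not the DFEF method itself; the names (Peer, fusion_head, kd_loss) and the loss weighting are illustrative.

```python
# Minimal sketch of teacher-free online knowledge distillation with feature
# fusion (generic scheme, not the paper's DFEF method).
import torch
import torch.nn as nn
import torch.nn.functional as F


class Peer(nn.Module):
    """A small peer network that exposes its penultimate features."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(32, num_classes)

    def forward(self, x):
        feat = self.backbone(x)           # intermediate feature
        return feat, self.head(feat)      # feature and logits


def kd_loss(student_logits, teacher_logits, T=4.0):
    """Soft-label distillation: KL divergence at temperature T."""
    p_t = F.softmax(teacher_logits / T, dim=1)
    log_p_s = F.log_softmax(student_logits / T, dim=1)
    return F.kl_div(log_p_s, p_t, reduction="batchmean") * T * T


peers = [Peer(), Peer()]
# The fusion classifier plays the role of the absent teacher: it classifies
# the concatenated peer features and its logits supervise each peer.
fusion_head = nn.Linear(32 * len(peers), 10)

x = torch.randn(8, 3, 32, 32)            # dummy batch
y = torch.randint(0, 10, (8,))

feats, logits = zip(*[p(x) for p in peers])
fused_logits = fusion_head(torch.cat(feats, dim=1))

# Each peer learns from the ground truth and from the fused "virtual teacher";
# detaching the fused logits keeps the distillation gradient one-directional.
loss = F.cross_entropy(fused_logits, y)
for lg in logits:
    loss = loss + F.cross_entropy(lg, y) + kd_loss(lg, fused_logits.detach())
loss.backward()
```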

Cited by 0 publications · References 39 publications