2023
DOI: 10.1007/978-981-99-8148-9_18

Dy-KD: Dynamic Knowledge Distillation for Reduced Easy Examples

Cheng Lin,
Ning Jiang,
Jialiang Tang
et al.

Cited by 2 publications (1 citation statement)
References: 16 publications

“…There exists extensive literature studying knowledge distillation (Jiao et al. 2020; Wang et al. 2020; Gou et al. 2021; Wu et al. 2022; Ren et al. 2023; Ji et al. 2023; Li et al. 2023). DeiT (Touvron et al. 2021) introduces a distillation token to allow the vision transformer to learn from a ConvNet teacher.…”
Section: Knowledge Distillation
Confidence: 99%
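For context on the distillation objective referenced in the citation statement above, here is a minimal sketch of the standard (Hinton-style) knowledge distillation loss that work such as Dy-KD and DeiT builds on. It is a generic illustration, not the Dy-KD method or the DeiT distillation-token mechanism; the function name and the temperature/alpha hyperparameters are illustrative choices rather than values from either paper.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, temperature=4.0, alpha=0.5):
    """Standard knowledge distillation loss: a weighted sum of cross-entropy
    on ground-truth labels and temperature-scaled KL divergence to the
    teacher's soft targets (generic sketch, not the Dy-KD objective)."""
    # Soft targets from the (frozen) teacher, softened by the temperature.
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    # The KL term is scaled by T^2 to keep gradient magnitudes comparable
    # to the hard-label term as the temperature changes.
    distill = F.kl_div(log_soft_student, soft_teacher,
                       reduction="batchmean") * temperature ** 2
    # Hard-label supervision on the original classification task.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * distill + (1.0 - alpha) * ce

# Example: distill a 10-class teacher into a student on one random batch.
student_logits = torch.randn(8, 10, requires_grad=True)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = kd_loss(student_logits, teacher_logits, labels)
loss.backward()
```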