2023
DOI: 10.1109/tcsvt.2022.3222013

Deep Cross-Layer Collaborative Learning Network for Online Knowledge Distillation

Cited by 7 publications (10 citation statements)
References 52 publications
“…This design can improve the predictive power of compact models. [20]–[24] all design a student-classmate ensemble training framework to obtain the knowledge of an ensemble teacher, which can guide both the student and the classmates efficiently in an end-to-end manner. AFID [65] directly employs one more complete sub-net to construct a two-branch ensemble training network.…”
Section: B. Knowledge Distillation Guided Training Framework
Mentioning confidence: 99%
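The statement above describes student-classmate ensemble training, where several peer branches are trained jointly and an ensemble teacher built from their combined logits distills knowledge back to every branch end-to-end. The snippet below is a minimal sketch of that idea in generic PyTorch; the logit-averaging teacher, the temperature T, the weight alpha, and the toy peer models are illustrative assumptions, not the DCCL, KD-ONE, or AFID implementation.

```python
# Minimal sketch of one student-classmate ensemble training step for online
# knowledge distillation. Assumptions: the ensemble teacher is a plain average
# of peer logits; T and alpha are illustrative hyperparameters.
import torch
import torch.nn as nn
import torch.nn.functional as F

def ensemble_distillation_step(peers, x, y, optimizer, T=3.0, alpha=1.0):
    """One end-to-end update: each peer (student + classmates) is supervised
    by the ground truth and by the averaged-logit ensemble teacher."""
    logits = [m(x) for m in peers]                     # per-branch predictions
    ensemble = torch.stack(logits, dim=0).mean(dim=0)  # ensemble teacher logits

    loss = F.cross_entropy(ensemble, y)                # supervise the ensemble too
    for z in logits:
        loss = loss + F.cross_entropy(z, y)            # hard-label loss per peer
        # soft-label distillation from the detached ensemble teacher
        kd = F.kl_div(F.log_softmax(z / T, dim=1),
                      F.softmax(ensemble.detach() / T, dim=1),
                      reduction="batchmean") * (T * T)
        loss = loss + alpha * kd

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Toy usage: three small peers trained jointly on random data.
if __name__ == "__main__":
    peers = nn.ModuleList([nn.Sequential(nn.Flatten(), nn.Linear(32 * 32 * 3, 10))
                           for _ in range(3)])
    opt = torch.optim.SGD(peers.parameters(), lr=0.1)
    x, y = torch.randn(8, 3, 32, 32), torch.randint(0, 10, (8,))
    print(ensemble_distillation_step(peers, x, y, opt))
```

Because the ensemble teacher is formed from the classmates themselves, no pre-trained teacher is required and all branches improve together during a single training run.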
“…In this paper, we select ResNet [20], [22]–[24], [51], [59]–[61], [63]. Specifically, KD-ONE [22], OEM [20], DCCL [24], AFID [65] and PCL [66] are recent notable methods which combine the knowledge distillation strategy with ensemble learning. All these methods adopt the student-classmate training framework (Fig.…”
Section: Classification on Lightweight Baseline Models
Mentioning confidence: 99%