AFMPM: adaptive feature map pruning method based on feature distillation
2023
DOI: 10.1007/s13042-023-01926-2


Cited by 1 publication (1 citation statement)
References 17 publications
“…Feature distillation, a form of knowledge distillation, involves transferring intermediate-layer features from the teacher model to the student network for learning. This enables the student model to directly align with intermediate feature map information and acquire the feature extraction capabilities of the teacher network (Guo YF, Zhang WW, Wang JH et al (2024) AFMPM: adaptive feature map pruning method based on feature distillation. International Journal of Machine Learning and Cybernetics 15:573–588. https://doi.org/10.1007/s13042-023-01926-2).…”
Section: Introduction
confidence: 99%
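
The citation statement above describes the core mechanism of feature distillation: the student network is trained to reproduce the teacher's intermediate feature maps in addition to solving the original task. The following minimal PyTorch sketch illustrates that generic idea, not the AFMPM method itself; the TinyTeacher and TinyStudent modules, layer sizes, and the 1x1 adapter convolution are illustrative assumptions introduced here.

    # Minimal sketch of feature distillation (generic, not AFMPM).
    # TinyTeacher/TinyStudent are hypothetical stand-in models.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class TinyTeacher(nn.Module):
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(nn.Conv2d(3, 64, 3, padding=1), nn.ReLU())
            self.head = nn.Linear(64, 10)

        def forward(self, x):
            f = self.features(x)                     # intermediate feature map
            return f, self.head(f.mean(dim=(2, 3)))  # global-average-pooled logits

    class TinyStudent(nn.Module):
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU())
            self.adapter = nn.Conv2d(16, 64, kernel_size=1)  # match teacher channels
            self.head = nn.Linear(16, 10)

        def forward(self, x):
            f = self.features(x)
            return self.adapter(f), self.head(f.mean(dim=(2, 3)))

    teacher, student = TinyTeacher().eval(), TinyStudent()
    opt = torch.optim.SGD(student.parameters(), lr=0.01)

    x = torch.randn(8, 3, 32, 32)        # dummy batch
    y = torch.randint(0, 10, (8,))       # dummy labels

    with torch.no_grad():
        t_feat, _ = teacher(x)           # teacher features serve as fixed targets

    s_feat, s_logits = student(x)

    # Feature-distillation objective: align the student's intermediate feature
    # maps with the teacher's (MSE term) plus the ordinary task loss.
    loss = F.mse_loss(s_feat, t_feat) + F.cross_entropy(s_logits, y)

    opt.zero_grad()
    loss.backward()
    opt.step()

The 1x1 adapter convolution is one common way to reconcile the channel mismatch between a small student and a larger teacher so the feature maps can be compared directly; it is an assumption of this sketch, and other alignment schemes are possible.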