2020 International Conference on Machine Vision and Image Processing (MVIP)
DOI: 10.1109/mvip49855.2020.9116875

Class Attention Map Distillation for Efficient Semantic Segmentation

Abstract: In recent years, deep neural networks have achieved remarkable accuracy in computer vision tasks. With inference time being a crucial factor, particularly in dense prediction tasks such as semantic segmentation, knowledge distillation has emerged as a successful technique for improving the accuracy of lightweight student networks. Existing methods often neglect the information carried in the channels and among different classes. To overcome these limitations, this paper proposes a novel method called Inter-Class Simila…
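
The method name in the abstract is cut off, so the paper's exact formulation is not recoverable from this page. As context only, below is a minimal sketch of distilling per-class spatial attention maps from segmentation logits, assuming PyTorch; the function names, the softmax-over-classes pooling, and the MSE matching term are illustrative assumptions, not the authors' method.

```python
import torch
import torch.nn.functional as F

def class_attention_maps(logits: torch.Tensor) -> torch.Tensor:
    """Turn segmentation logits (N, C, H, W) into per-class spatial
    attention maps: softmax over the class dimension, then L2-normalise
    each class map over its spatial positions."""
    probs = logits.softmax(dim=1)         # (N, C, H, W) per-pixel class probabilities
    maps = probs.flatten(2)               # (N, C, H*W)
    return F.normalize(maps, p=2, dim=2)  # unit L2 norm per class map

def cam_distillation_loss(student_logits: torch.Tensor,
                          teacher_logits: torch.Tensor) -> torch.Tensor:
    """MSE between student and teacher class attention maps; the teacher
    is detached so gradients flow only into the student."""
    return F.mse_loss(class_attention_maps(student_logits),
                      class_attention_maps(teacher_logits.detach()))
```

In training, a term like this is typically added to the student's ordinary cross-entropy objective with a weighting coefficient, e.g. loss = ce + lambda_kd * cam_distillation_loss(student_out, teacher_out).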

Cited by 2 publications (1 citation statement) · References 54 publications
“…The attention mechanism originated from the study of human vision. Various attention mechanisms [27][28][29][30][31] have been developed and significantly impacted the field of computer vision. Incorporating attention mechanisms into deep learning has enhanced the performance of various models in recent years.…”
Section: Attention Mechanism (citation type: mentioning), confidence: 99%