2019 IEEE/CVF International Conference on Computer Vision Workshop (ICCVW) 2019
DOI: 10.1109/iccvw.2019.00558
An Adversarial Approach to Discriminative Modality Distillation for Remote Sensing Image Classification

Cited by 21 publications (5 citation statements) | References 37 publications
“…In contrast, distillation loss incorporates the knowledge transfer process into the main downstream task. Additionally, architectures utilizing hallucination streams often require multistage training rather than an end-to-end approach [82], [93], [94]. Variations of CNN architecture dominate in this category.…”
Section: B. Knowledge Distillation
confidence: 99%
“…In Pande et al. [82], GANs are integrated into a teacher-student training paradigm. On the other hand, the works of Wei et al. [84] and Li et al. [83] underscore the utility of hallucination branches.…”
Section: Remote Sensing
confidence: 99%
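The GAN-in-a-teacher-student-paradigm idea mentioned in the excerpt above can be pictured with a small sketch. The following is a generic adversarial feature-matching setup, not the cited paper's exact architecture: a frozen teacher branch fed with multimodal input produces "real" features, a single-modality student produces "fake" ones, and a discriminator pushes the student toward the teacher's feature distribution. All module definitions, dimensions, and hyperparameters are illustrative assumptions.

```python
import torch
import torch.nn as nn

feat_dim = 256

# Illustrative branches: the teacher sees concatenated multimodal features (dim 512),
# the student sees a single modality (dim 256). Neither reflects the cited paper.
teacher = nn.Sequential(nn.Linear(512, feat_dim), nn.ReLU())
student = nn.Sequential(nn.Linear(256, feat_dim), nn.ReLU())
discriminator = nn.Sequential(nn.Linear(feat_dim, 64), nn.ReLU(), nn.Linear(64, 1))

bce = nn.BCEWithLogitsLoss()
opt_s = torch.optim.Adam(student.parameters(), lr=1e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=1e-4)

def adversarial_step(x_multi, x_single):
    """One GAN-style update: the discriminator separates teacher from student
    features, then the student is updated to fool it."""
    with torch.no_grad():
        f_t = teacher(x_multi)            # teacher features ("real"), teacher kept frozen
    f_s = student(x_single)               # student features ("fake")

    # 1) Discriminator update: teacher features -> 1, student features -> 0.
    d_loss = bce(discriminator(f_t), torch.ones(len(f_t), 1)) + \
             bce(discriminator(f_s.detach()), torch.zeros(len(f_s), 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # 2) Student update: make its features look like teacher features.
    g_loss = bce(discriminator(f_s), torch.ones(len(f_s), 1))
    opt_s.zero_grad(); g_loss.backward(); opt_s.step()
    return d_loss.item(), g_loss.item()
```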
“…A knowledge distillation framework is proposed in [46], which makes the outputs of the student and teacher models match. The discriminative modality distillation approach is introduced in [47]: the teacher is trained on multimodal data, and the student model then learns from the teacher to improve remote sensing image classification performance. To address the problem of network overfitting due to noisy data, a novel noisy label distillation method (NLD) is proposed in [48].…”
Section: Knowledge Distillation and SD
confidence: 99%
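As a rough illustration of the output-matching distillation described in this excerpt, the snippet below combines a softened KL term against the teacher's logits with the usual cross-entropy on ground-truth labels. The teacher is assumed to be pre-trained on multimodal data and frozen; the temperature, weighting, and the `teacher`/`student` model names in the usage comment are assumptions, not values or interfaces from [47].

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Weighted sum of a temperature-softened KL term (student vs. teacher)
    and plain cross-entropy on the ground-truth labels."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Usage sketch (hypothetical multimodal teacher, frozen after pre-training):
# with torch.no_grad():
#     teacher_logits = teacher(rgb, second_modality)
# loss = distillation_loss(student(rgb), teacher_logits, labels)
```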
“…A knowledge distillation framework is proposed in [47], which makes the outputs of the student and teacher models match. The discriminative modality distillation approach is introduced in [48]: the teacher is trained on multimodal data, and the student model then learns from the teacher to improve the performance of RS image classification. To address the problem of network overfitting due to noisy data, a novel noisy label distillation method (NLD) is proposed in [49].…”
Section: Knowledge Distillation and Self-Distillation
confidence: 99%