2021
DOI: 10.3390/rs13245009

Weakly Supervised Classification of Hyperspectral Image Based on Complementary Learning

Abstract: In recent years, supervised learning-based methods have achieved excellent performance for hyperspectral image (HSI) classification. However, collecting labeled training samples is both costly and time-consuming, which often gives rise to weak supervision: inaccurate supervision, where mislabeled samples exist, and incomplete supervision, where unlabeled samples exist. Focusing on inaccurate and incomplete supervision, the weakly supervised classificati…
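The complementary learning named in the title trains on complementary labels, i.e., a class a sample is known *not* to belong to, which is cheaper to annotate reliably than exact labels. A minimal sketch of one common complementary loss, -log(1 - p_ȳ), is shown below; the function names and the toy batch are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def softmax(logits):
    """Numerically stable softmax over the last axis."""
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def complementary_loss(logits, comp_labels):
    """Complementary-label loss: penalize the probability mass
    assigned to the class each sample is known NOT to be.
    L = -log(1 - p_{y_bar}), averaged over the batch."""
    p = softmax(logits)
    p_bar = p[np.arange(len(comp_labels)), comp_labels]
    return float(-np.log(1.0 - p_bar + 1e-12).mean())

# Toy batch: 2 samples, 3 classes.
logits = np.array([[2.0, 0.5, -1.0],
                   [0.1, 0.2, 3.0]])
comp = np.array([2, 0])  # sample 0 is NOT class 2; sample 1 is NOT class 0
loss = complementary_loss(logits, comp)
```

The loss shrinks toward zero as the model pushes probability away from the complementary class, so ordinary gradient descent on it steers the classifier using only "not this class" information.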

Cited by 6 publications (5 citation statements) | References 62 publications
“…[54] proposed a novel adaptive multi-feature collaborative representation classifier to correct the labels of uncertain samples. (4) Hybrid approach [21,23,26,51,[55][56][57][58]. Both [51] and [55] introduced unsupervised methods into label noise learning.…”
Section: Deep Neural Network-based Label Noise Learning in Remote Sensing (mentioning)
confidence: 99%
“…In [55], an unsupervised method was combined with domain adaptation for HSI classification. In addition, complementary learning was combined with deep learning for HSI classification [26] and RS scene classification [56]. Recently, Ref.…”
Section: Deep Neural Network-based Label Noise Learning in Remote Sensing (mentioning)
confidence: 99%
“…The phenomenon of small samples is an inherent characteristic of mapping the real world to the digital world, and hyperspectral data processing constantly faces this challenge. To address the lack of samples, there are currently two main solutions: 1) data augmentation (sample expansion) [29], [30], [31], [32]; 2) adaptation to small-sample learning [33], [34], [35]. Generally speaking, the larger the number of training samples in deep learning, the more representative the extracted data features; conversely, the smaller the sample size, the less general the feature expression.…”
Section: Introduction (mentioning)
confidence: 99%
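The first solution mentioned in the statement above, data augmentation (sample expansion), can be sketched for hyperspectral pixels by perturbing each labeled spectrum; the Gaussian-noise scheme and the `sigma` value here are hypothetical illustrations, not the method of the cited works.

```python
import numpy as np

def augment_spectra(X, n_copies=3, sigma=0.01, seed=0):
    """Expand a small labeled set of hyperspectral pixels by
    adding small Gaussian perturbations to each spectrum.
    X: (n_samples, n_bands) reflectance array.
    Returns an array of shape (n_samples * (n_copies + 1), n_bands),
    with the original samples first."""
    rng = np.random.default_rng(seed)
    copies = [X]
    for _ in range(n_copies):
        copies.append(X + rng.normal(0.0, sigma, size=X.shape))
    return np.concatenate(copies, axis=0)

# Toy example: 5 labeled pixels, each with 20 spectral bands.
X = np.random.rand(5, 20)
X_aug = augment_spectra(X)  # 5 * (3 + 1) = 20 rows
```

Each augmented copy inherits the label of its source pixel, so a 5-sample training set becomes 20 samples at no annotation cost; in practice `sigma` should be small relative to the sensor's noise floor so the perturbed spectra remain physically plausible.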
“…For instance, Melgani and Bruzzone (2004) against a traditional CNN model. However, acquiring thousands of accurate reference training samples using traditional field data collection methods, such as the Global Navigation Satellite System (GNSS), may be impractical due to time constraints, logistical limitations, and terrain inaccessibility (Fang et al., 2020; Huang et al., 2021).…”
Section: Introduction (mentioning)
confidence: 99%