2019
DOI: 10.1109/jstars.2019.2899157
Semi-Supervised Hyperspectral Band Selection Based on Dynamic Classifier Selection

Cited by 33 publications (12 citation statements)
References 45 publications
“…In order to validate the superiority of our proposed RMGF, five other unsupervised hyperspectral band selection methods are used for comparison: 1) UBS [Chang and Wang, 2006], which divides the hyperspectral image cube into multiple subcubes of equal width based on the required number of selected bands; each segmentation point is viewed as a representative band. 2) TOF [Wang et al, 2018], which constructs an optimal clustering model with a rank constraint to provide an effective criterion for selecting bands from an existing clustering structure.…”
Section: Compared Methods
confidence: 99%
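The UBS scheme described in the excerpt above (equal-width segmentation of the spectral axis, with each segmentation point taken as a representative band) can be sketched as follows. The function name and the choice to include both spectral endpoints are illustrative assumptions, not details taken from the cited paper.

```python
import numpy as np

def uniform_band_selection(num_bands: int, num_selected: int) -> list[int]:
    """Sketch of uniform band selection (UBS): place equally spaced
    segmentation points along the spectral axis and treat each one as
    a representative band. Endpoint inclusion is an assumption."""
    # Equally spaced segmentation points over the band index range.
    points = np.linspace(0, num_bands - 1, num_selected)
    # Round each segmentation point to the nearest valid band index.
    return sorted(set(np.rint(points).astype(int).tolist()))

# e.g. reduce a 200-band cube to 5 representative bands
print(uniform_band_selection(200, 5))
```

Because only the band count and the requested number of bands are used, UBS needs no labels and no clustering step, which is why it is a common unsupervised baseline.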
“…Based on the availability of sample labels, existing band selection methods can be categorized into supervised ones [Feng et al, 2014; Cao et al, 2019] and unsupervised ones [Jia et al, 2012; Yuan et al, 2016]. Supervised methods need sample labels to train a classifier that selects the optimal bands.…”
Section: Introduction
confidence: 99%
“…As for wrapper methods, feature selection and classification are separate and conducted iteratively. Typical examples are the dynamic classifier [12], the hybrid whale optimization algorithm with simulated annealing (WOASA) [13], and the modified ant lion optimizer (MALO) [14]. In embedded methods, feature selection and classification are unified into one model.…”
Section: Introduction
confidence: 99%
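The wrapper pattern described in the excerpt above (feature selection and classification kept separate and run iteratively) can be sketched as a greedy forward search. The function, the scoring callback, and the greedy strategy are illustrative assumptions for a generic wrapper, not the specific algorithms cited ([12]–[14]).

```python
import numpy as np

def greedy_wrapper_selection(X, y, score_fn, num_selected):
    """Hypothetical greedy forward wrapper: at each iteration, add the
    band whose inclusion yields the best score, where score_fn(X_sub, y)
    wraps any classifier evaluation on the candidate band subset."""
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(num_selected):
        # Evaluate each remaining band appended to the current subset.
        best_band = max(remaining,
                        key=lambda b: score_fn(X[:, selected + [b]], y))
        selected.append(best_band)
        remaining.remove(best_band)
    return selected
```

In practice `score_fn` would be something like the cross-validated accuracy of the wrapped classifier, which is what makes wrapper methods accurate but computationally heavier than filter approaches.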
“…However, some HSIs lack labeled samples, because labeling samples is a costly, time-consuming, and labor-intensive task. On HSIs with only a few labeled samples, unsupervised and semi-supervised learning have played a major role [12], [18], [27]. As the number of HSIs increases, it can be found that many HSIs are related.…”
Section: Introduction
confidence: 99%