2017
DOI: 10.1016/j.jvcir.2017.02.002

Representative band selection for hyperspectral image classification


Cited by 51 publications (24 citation statements)
References 38 publications
“…Many hyperspectral data analysis techniques use dimensionality reduction as a pre-processing step, which aims to remove the redundant spectral information while preserving only critical information for subsequent processing. Hyperspectral dimensionality reduction can be achieved through feature extraction or band selection [44], [45]. Feature extraction methods like Principal Component Analysis (PCA) and Minimum Noise Fraction (MNF) transform the original data into reduced feature spaces by means of different criteria, whereas band selection aims to select a small subset of hyperspectral bands to reduce the burden of heavy computations [46].…”
Section: Problem Statement and Proposed Algorithm
confidence: 99%
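The quoted passage contrasts feature extraction (e.g., PCA), which transforms all bands into a new reduced space, with band selection, which keeps a physical subset of bands. As a minimal sketch of the feature-extraction side, the following PCA reduction of a hyperspectral cube is illustrative only (the function name and synthetic data are not from the cited works):

```python
import numpy as np

def pca_reduce(cube, n_components):
    """Reduce a (H, W, B) hyperspectral cube to n_components features via PCA.

    Illustrative sketch of feature extraction; not the cited papers' method.
    """
    H, W, B = cube.shape
    X = cube.reshape(-1, B).astype(float)     # pixels as rows, bands as columns
    X -= X.mean(axis=0)                       # center each band
    cov = np.cov(X, rowvar=False)             # B x B band covariance
    eigvals, eigvecs = np.linalg.eigh(cov)    # eigenvalues in ascending order
    top = eigvecs[:, ::-1][:, :n_components]  # leading principal axes
    return (X @ top).reshape(H, W, n_components)

# Synthetic 10x10 scene with 50 bands reduced to 3 components
cube = np.random.rand(10, 10, 50)
reduced = pca_reduce(cube, 3)
print(reduced.shape)  # → (10, 10, 3)
```

Note that, unlike band selection, the resulting components are linear mixtures of all original bands, which is exactly the distinction the quote draws.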
“…Then, we randomly choose 10% of the data to train the SVM classifier for all methods and obtain each method's classification accuracy over different numbers of selected bands. Second, after band selection is performed by all methods, following [49], [50], which train the SVM on 20% of the data, we likewise choose another 20% of the data to train a new SVM classifier, comparing against STMIGR [49], improved sparse subspace clustering (ISSC) [50], and WaLuDi; our methods include GradHM+AS, GradHM+TS, Guided-GradHM+AS, and Guided-GradHM+TS. Third, the CNN model for the Indian Pines data should be trained with different proportions of samples to obtain GradHM and Guided-GradHM.…”
Section: B. Experimental Setup
confidence: 99%
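The evaluation protocol in this quote is simple to sketch: train a classifier on a small random fraction of labelled pixels restricted to the selected bands, then measure accuracy on the rest. The helper below is a hypothetical illustration (not code from the cited works); a 1-nearest-neighbour classifier stands in for the SVM so the sketch needs only NumPy:

```python
import numpy as np

def evaluate_band_subset(X, y, bands, train_frac=0.1, seed=0):
    """Score a selected band subset: train on a small stratified fraction
    of labelled pixels, test on the remainder.

    1-NN stands in for the SVM used in the quoted protocol.
    """
    rng = np.random.default_rng(seed)
    Xs = X[:, bands]
    tr = []
    for c in np.unique(y):  # stratified split: sample train_frac per class
        members = rng.permutation(np.flatnonzero(y == c))
        tr.extend(members[: max(1, int(train_frac * len(members)))])
    tr = np.array(tr)
    te = np.setdiff1d(np.arange(len(y)), tr)
    # 1-NN: each test pixel takes the label of its closest training pixel
    d = np.linalg.norm(Xs[te, None, :] - Xs[None, tr, :], axis=2)
    pred = y[tr][d.argmin(axis=1)]
    return float((pred == y[te]).mean())

# Two well-separated synthetic classes: any band subset classifies them
X = np.vstack([np.random.rand(50, 20), np.random.rand(50, 20) + 5.0])
y = np.array([0] * 50 + [1] * 50)
acc = evaluate_band_subset(X, y, bands=[0, 5, 10])
print(acc)  # → 1.0
```

With real data the interesting signal is how `acc` changes as `bands` and `train_frac` vary, which is what the quoted comparison across 10% and 20% training splits probes.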
“…Some algorithms have been reviewed in [10], where the hyperspectral bands are ranked by measures such as the Shannon entropy or spectral derivatives. Other approaches include band clustering using various similarity measures and selecting representatives [11,12]. Popular similarity measures include information theoretical measures [13] or the correlation coefficient [14].…”
Section: Accepted Manuscript
confidence: 99%
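The clustering-then-representatives idea in the last quote can be sketched with the correlation coefficient as the similarity measure. The function below is a simplified, assumed implementation (not from [11,12]): it groups contiguous bands, since neighbouring bands tend to be the most correlated, and keeps the band most correlated with the rest of its group:

```python
import numpy as np

def select_representative_bands(cube, n_groups):
    """Group spectral bands and keep one representative per group.

    Minimal sketch of clustering-based band selection: contiguous groups
    approximate correlation clusters, and each group's representative is
    the band with the highest mean correlation to its group-mates.
    """
    H, W, B = cube.shape
    X = cube.reshape(-1, B).astype(float)
    corr = np.abs(np.corrcoef(X, rowvar=False))  # B x B band correlation
    selected = []
    for group in np.array_split(np.arange(B), n_groups):
        sub = corr[np.ix_(group, group)]
        # representative = band most correlated with the rest of its group
        selected.append(int(group[sub.mean(axis=1).argmax()]))
    return selected

cube = np.random.rand(8, 8, 30)
bands = select_representative_bands(cube, 5)
print(len(bands))  # → 5
```

A fuller implementation would replace the contiguous split with an actual clustering of the correlation (or mutual-information) matrix, but the select-one-per-cluster step is the same.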