2017
DOI: 10.3906/elk-1503-167

A comparative analysis of classification methods for hyperspectral images generated with conventional dimension reduction methods

Abstract: This paper compared the performance of classification methods for a hyperspectral image dataset in view of dimensionality reduction (DR). Among conventional DR methods, principal component analysis, maximum noise fraction, and independent component analysis were used for dimension reduction. The study was conducted using these DR techniques on a real hyperspectral image, an AVIRIS dataset with 224 bands, throughout the experiments. It was observed that DR may have a significant effect on the classification…
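The workflow the abstract outlines can be illustrated with a minimal sketch: reduce the spectral dimension of a hyperspectral cube with a conventional DR method, then classify the reduced pixels. The synthetic cube, the number of retained components, and the SVM classifier below are assumptions for illustration only, not the authors' exact experimental setup.

```python
# Hedged sketch of a DR-then-classify pipeline on a hyperspectral cube.
# All data here is synthetic; the real study used an AVIRIS scene with 224 bands.
import numpy as np
from sklearn.decomposition import PCA, FastICA
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic stand-in for a hyperspectral cube: rows x cols x 224 bands.
rows, cols, bands = 50, 50, 224
cube = rng.normal(size=(rows, cols, bands))
labels = rng.integers(0, 5, size=(rows, cols))   # hypothetical ground truth

X = cube.reshape(-1, bands)                       # pixels as samples
y = labels.ravel()

# Dimensionality reduction: keep a small number of components (assumed value).
n_components = 10
X_pca = PCA(n_components=n_components).fit_transform(X)
X_ica = FastICA(n_components=n_components, random_state=0).fit_transform(X)

# Classify the reduced pixels (the RBF SVM is only an illustrative choice).
for name, Xr in [("PCA", X_pca), ("ICA", X_ica)]:
    X_tr, X_te, y_tr, y_te = train_test_split(Xr, y, test_size=0.3, random_state=0)
    clf = SVC(kernel="rbf").fit(X_tr, y_tr)
    print(name, "accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```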

Cited by 8 publications (3 citation statements)
References 16 publications
“…The scree test is a criterion used as an optimal method for reducing the dimensionality band from the hyperspectral dataset [6]. From the scree plot, it is shown that the first nine principal components are retained and the bands corresponding to these components provide large data variance when compared with the low-order PC.…”
Section: Interpretation of Eigenvalues Presented in a Scree Plot
Confidence: 99%
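The scree criterion described in this citation can be sketched in a few lines: fit PCA to the band data, inspect the eigenvalues (explained variance), and retain the leading components before the curve flattens. The synthetic pixel matrix and the cutoff of nine components below are illustrative assumptions, not values taken from the cited work.

```python
# Minimal scree-style inspection of PCA eigenvalues on synthetic band data.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 224))          # pixels x bands, synthetic stand-in

pca = PCA().fit(X)
eigenvalues = pca.explained_variance_

# Print the leading eigenvalues and the variance share captured by nine PCs.
print(eigenvalues[:12])
print("variance kept by 9 PCs:", pca.explained_variance_ratio_[:9].sum())
```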
“…Classification results are greatly influenced by MNF when the ground cover features exhibit homogeneity. The efficiency of the classifiers improves with reduced components that reveal apparently informative bands [5,6]. This paper compares and evaluates the performance of two defined DR methods, namely PCA and MNF, by interpreting the eigen values acquired during processing.…”
Section: Introduction
Confidence: 99%
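The MNF transform mentioned here can be sketched from first principles: whiten the data against an estimated noise covariance, then apply PCA, so components are ordered by signal-to-noise ratio rather than raw variance. The neighbour-difference noise estimate below is a common heuristic assumed for this sketch, not necessarily the procedure used in the cited papers.

```python
# Hedged from-scratch sketch of the MNF idea (noise-adjusted PCA).
import numpy as np

def mnf(cube):
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands)

    # Noise estimate: differences between horizontally adjacent pixels.
    noise = (cube[:, 1:, :] - cube[:, :-1, :]).reshape(-1, bands)
    cov_noise = np.cov(noise, rowvar=False)

    # Whitening transform with respect to the noise covariance.
    evals, evecs = np.linalg.eigh(cov_noise)
    whiten = evecs / np.sqrt(np.maximum(evals, 1e-12))

    Xw = (X - X.mean(axis=0)) @ whiten

    # PCA on the noise-whitened data yields the MNF components.
    cov_signal = np.cov(Xw, rowvar=False)
    svals, svecs = np.linalg.eigh(cov_signal)
    order = np.argsort(svals)[::-1]               # highest SNR first
    return Xw @ svecs[:, order], svals[order]

rng = np.random.default_rng(2)
components, snr_eigenvalues = mnf(rng.normal(size=(40, 40, 50)))
print(components.shape, snr_eigenvalues[:5])
```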
“…In this context, in a first study, Ladrón de Guevara & Torra (2014) estimated the underlying structure of systematic risk by using Principal Component Analysis (PCA) and Factor Analysis (FA); it included the testing of the models in two versions: returns and returns over the riskless interest rate for weekly and daily databases, and a two-stage methodology for the econometric contrast. First, they extracted the underlying systematic risk factors using both the standard linear version of Principal Component Analysis and the maximum likelihood Factor Analysis estimation, and they were able to reconstruct the observed returns using the factors extracted almost perfectly in all cases.…”
Section: Introduction
Confidence: 99%
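The two-step methodology summarized in this citation (extract latent factors, then reconstruct the observed returns from them) can be sketched with standard tools; the synthetic return matrix and the choice of three factors below are hypothetical, not the setup of the cited study.

```python
# Rough sketch, under assumed data, of factor extraction with PCA and
# maximum likelihood Factor Analysis, followed by reconstruction of returns.
import numpy as np
from sklearn.decomposition import PCA, FactorAnalysis

rng = np.random.default_rng(3)
returns = rng.normal(size=(500, 30))   # 500 periods x 30 assets, synthetic

k = 3                                   # assumed number of systematic factors

pca = PCA(n_components=k).fit(returns)
fa = FactorAnalysis(n_components=k).fit(returns)

# Reconstruct the observed returns from the extracted factors.
recon_pca = pca.inverse_transform(pca.transform(returns))
recon_fa = fa.transform(returns) @ fa.components_ + fa.mean_

for name, recon in [("PCA", recon_pca), ("FA", recon_fa)]:
    err = np.linalg.norm(returns - recon) / np.linalg.norm(returns)
    print(name, "relative reconstruction error:", round(err, 3))
```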