3rd IEEE Conference on Industrial Electronics and Applications (ICIEA), 2008
DOI: 10.1109/iciea.2008.4582482
The effect of features reduction on different texture classifiers

Cited by 4 publications (2 citation statements, published 2009–2022). References 2 publications.
“…The analysis of hyperspectral imagery usually implicates the reduction of data set dimensionality to decrease the complexity of the classifier and the computational time required with the aim of preserving most of the relevant information of the original data according to some optimal or suboptimal criteria [3], [4]. The preprocessing procedure exploited in this section divides the hyperspectral signatures into adjacent regions of the spectrum and approximates their values by piecewise constant functions.…”
Section: A. Reduction of Data Dimensionality (citation type: mentioning)
confidence: 99%
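The preprocessing the excerpt above describes — splitting a hyperspectral signature into adjacent spectral regions and replacing each region's values with a constant — can be sketched as follows. This is a minimal illustration, not the cited authors' implementation; the region count and the use of the mean as the constant are assumptions.

```python
import numpy as np

def piecewise_constant_reduce(signature, n_regions):
    """Reduce a hyperspectral signature by splitting it into adjacent
    spectral regions and approximating each region by its mean value
    (a piecewise constant approximation).

    NOTE: illustrative sketch; the region count and the choice of the
    mean as the per-region constant are assumptions, not taken from
    the cited works.
    """
    bands = np.asarray(signature, dtype=float)
    regions = np.array_split(bands, n_regions)   # adjacent, near-equal regions
    return np.array([r.mean() for r in regions])

# A toy 200-band signature reduced to 10 features
sig = np.linspace(0.0, 1.0, 200)
reduced = piecewise_constant_reduce(sig, 10)
print(reduced.shape)  # (10,)
```

The reduced vector keeps the coarse spectral shape of the signature while cutting the classifier's input dimensionality from 200 to 10.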
“…Eigenvalue Grads Method (EGM) [2] is a popular method for estimating the number of sources impinging on an array of sensors. The Principal Component Analysis (PCA) neural network [3], [4] is widely used in applications ranging from neuroscience to signal processing to extract relevant information from high-dimensional data sets. The PCA neural network has been found very useful for extracting the most representative low-dimensional subspace from a high-dimensional vector space.…”
Section: Introduction (citation type: mentioning)
confidence: 99%
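A classic single-neuron PCA network of the kind the excerpt above refers to is Oja's learning rule, which iteratively drives a weight vector toward the first principal component of the data. The sketch below is a minimal illustration under that assumption; the cited works [3], [4] may use a different network variant, and the learning rate, epoch count, and toy data are all invented here.

```python
import numpy as np

def oja_principal_component(X, lr=0.005, epochs=50, seed=0):
    """Estimate the first principal component with Oja's learning rule,
    the classic single-neuron PCA network: w += lr * y * (x - y * w).

    NOTE: illustrative sketch; learning rate and epoch count are
    assumptions, and the cited works may use a different PCA network.
    """
    rng = np.random.default_rng(seed)
    Xc = np.asarray(X, dtype=float)
    Xc = Xc - Xc.mean(axis=0)              # centre the data
    w = rng.normal(size=Xc.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(epochs):
        for x in Xc:
            y = w @ x                      # neuron output
            w += lr * y * (x - y * w)      # Hebbian term with Oja's decay
        w /= np.linalg.norm(w)             # renormalise for stability
    return w

# Toy data whose variance is dominated by the first axis
rng = np.random.default_rng(1)
X = np.column_stack([3.0 * rng.normal(size=200),
                     0.1 * rng.normal(size=200)])
w = oja_principal_component(X)
print(abs(w[0]) > 0.9)  # expected: True, the learned direction aligns with axis 0
```

Unlike batch PCA via an eigendecomposition, the rule processes one sample at a time, which is what makes such networks attractive for high-dimensional streaming data.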