2012
DOI: 10.1080/19479832.2012.702687
Classification of hyperspectral data using extended attribute profiles based on supervised and unsupervised feature extraction techniques

Cited by 56 publications (46 citation statements)
References 26 publications
“…Recently, some novel classification strategies have been proposed for classifying hyperspectral images, such as random forest [35]. However, to evaluate the robustness of the deep features, the widely used SVM classifier is considered the benchmark in the experiment.…”
Section: Comparison of Different Methods (mentioning)
confidence: 99%
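The benchmark set-up described in this citation statement can be sketched with scikit-learn (assumed available): fit both a random forest and an RBF SVM on the same per-pixel features and compare test accuracies. The dataset here is a synthetic stand-in for spectral features, and all parameters are illustrative, not the cited authors' settings.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Toy stand-in for per-pixel spectral features: 2 classes, 20 "bands".
X = np.vstack([rng.normal(0.0, 1.0, (200, 20)),
               rng.normal(1.5, 1.0, (200, 20))])
y = np.repeat([0, 1], 200)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# SVM as the benchmark classifier, random forest as the alternative.
svm_acc = SVC(kernel="rbf").fit(X_tr, y_tr).score(X_te, y_te)
rf_acc = RandomForestClassifier(random_state=0).fit(X_tr, y_tr).score(X_te, y_te)
```

On this well-separated toy data both classifiers score near 1.0; on real hyperspectral scenes the gap between them is what the benchmark is meant to expose.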
“…The well-known transformation-based characteristics learning methods include principal component analysis (PCA), minimum noise fraction (MNF), etc. PCA [32] can express data in minimum mean square error, but it will be influenced by noise. Therefore, Green et al [30] and Lee et al [31] proposed minimum noise separation methods, which arrange the components of the transformation according to the order of signal-to-noise ratio (SNR).…”
Section: Related Work (mentioning)
confidence: 99%
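The minimum-mean-square-error property of PCA mentioned in this statement can be shown in a few lines of numpy: project onto the top-k eigenvectors of the covariance and reconstruct. This is a generic PCA sketch, not code from any of the cited papers.

```python
import numpy as np

def pca(X, k):
    """Project rows of X onto the top-k principal components.

    Among all rank-k linear projections, this one minimises the
    mean-squared reconstruction error, as the quoted passage notes.
    """
    mu = X.mean(axis=0)
    Xc = X - mu
    cov = Xc.T @ Xc / (X.shape[0] - 1)
    vals, vecs = np.linalg.eigh(cov)
    order = np.argsort(vals)[::-1]          # decreasing variance
    W = vecs[:, order[:k]]
    return Xc @ W, W, mu

rng = np.random.default_rng(0)
# Toy "hyperspectral" matrix: 100 pixels x 10 bands, 2 latent sources.
X = rng.normal(size=(100, 2)) @ rng.normal(size=(2, 10))
Z, W, mu = pca(X, 2)
X_hat = Z @ W.T + mu   # exact here, because the data has rank 2
```

Because the toy data lies in a 2-dimensional subspace, two components reconstruct it exactly; with full-rank noisy data the reconstruction error equals the sum of the discarded eigenvalues.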
“…Transformation-based feature learning methods [29][30][31][32] map or transfer the original data from the high-dimensional data space into the low-dimensional feature space. The well-known transformation-based characteristics learning methods include principal component analysis (PCA), minimum noise fraction (MNF), etc.…”
Section: Related Work (mentioning)
confidence: 99%
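The MNF transform referenced here (Green et al.) can be sketched as noise whitening followed by PCA: in the noise-whitened space, component variance is 1 + SNR, so sorting by variance sorts by signal-to-noise ratio. In practice the noise covariance must be estimated (e.g. from neighbouring-pixel differences); in this illustrative sketch it is known by construction.

```python
import numpy as np

def mnf(X, noise_cov, k):
    """Minimum-noise-fraction transform: whiten the noise, then PCA.

    Components come out ordered by signal-to-noise ratio rather than
    raw variance, the ordering the quoted passage attributes to MNF.
    """
    n_vals, n_vecs = np.linalg.eigh(noise_cov)
    Wn = n_vecs @ np.diag(1.0 / np.sqrt(n_vals)) @ n_vecs.T  # noise whitener
    Xw = (X - X.mean(axis=0)) @ Wn
    vals, vecs = np.linalg.eigh(np.cov(Xw, rowvar=False))
    order = np.argsort(vals)[::-1]
    return Xw @ vecs[:, order[:k]]

rng = np.random.default_rng(1)
# Signal confined to 2 directions of a 6-band space, plus i.i.d. noise.
signal = rng.normal(size=(500, 2)) @ rng.normal(size=(2, 6)) * 3.0
noise = rng.normal(size=(500, 6))
Z = mnf(signal + noise, np.eye(6), k=2)
```

With identity noise covariance the whitening step is a no-op and MNF reduces to PCA; the two transforms differ exactly when the noise is correlated or band-dependent.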
“…As a result, we can perform dimensionality reduction, using techniques such as PCA [14], and then perform attribute filtering on the first few PCs, as suggested in [27], in order to reduce computational complexity. For multispectral data, since there are only a few spectral bands available, we can perform attribute filtering on the full original spectral data.…”
Section: A. EMAPs (mentioning)
confidence: 99%
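The workflow in this statement (reduce the hyperspectral data with PCA, then attribute-filter the first few PCs) can be sketched as follows, with scipy assumed available. The connected-component area filter below is a minimal stand-in for the full attribute profiles of the cited work, and every size and threshold is illustrative.

```python
import numpy as np
from scipy import ndimage

def area_filter(img, min_area):
    """Keep only connected foreground regions of at least `min_area`
    pixels: a minimal area-attribute filter on a binary image."""
    labels, n = ndimage.label(img)
    sizes = ndimage.sum(img, labels, index=np.arange(1, n + 1))
    keep = np.isin(labels, 1 + np.flatnonzero(sizes >= min_area))
    return keep

rng = np.random.default_rng(0)
# Toy 8x8x5 "cube": 64 pixels, 5 spectral bands.
cube = rng.normal(size=(8, 8, 5))
X = cube.reshape(-1, 5)

# Step 1: PCA down to a few components (here via SVD).
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
pcs = (Xc @ Vt[:2].T).reshape(8, 8, 2)

# Step 2: attribute filtering on each retained PC (binarised here).
filtered = np.stack([area_filter(pc > 0, min_area=3)
                     for pc in np.moveaxis(pcs, -1, 0)])
```

For multispectral inputs, step 1 would simply be skipped and the filter applied to every original band, as the quote suggests.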
“…Although classifiers such as support vector machines (SVMs) [20], with reduced sensitivity to limited training samples, have been successfully used in the literature [6], [21]- [26], the information extracted by morphological approaches can be made more compact in order to further optimize the classification process. This issue was investigated in [16], [27], and [28] by applying several feature extraction and selection techniques to morphological profiles and APs prior to classification, aiming at reducing the high dimensionality of the profile by keeping only (few) relevant features. A compact representation of the profile was also proposed in [13] by defining the so-called morphological characteristic (MC), which derives from the DMP if the underlying region of each pixel is mostly darker or brighter with respect to its surroundings.…”
(mentioning)
confidence: 99%