2018
DOI: 10.3390/rs10101564
Unsupervised Feature Selection Based on Ultrametricity and Sparse Training Data: A Case Study for the Classification of High-Dimensional Hyperspectral Data

Abstract: In this paper, we investigate the potential of unsupervised feature selection techniques for classification tasks, where only sparse training data are available. This is motivated by the fact that unsupervised feature selection techniques combine the advantages of standard dimensionality reduction techniques (which only rely on the given feature vectors and not on the corresponding labels) and supervised feature selection techniques (which retain a subset of the original set of features). Thus, feature selecti…

Cited by 19 publications (8 citation statements)
References 74 publications
“…Wang et al., 2017). This allows gaining predictive ability and retaining meaningful features with respect to a given task (Bradley et al., 2018; Xue & Su, 2017). Moreover, some studies showed that SVM approaches are sensitive to data set dimensionality reduction (Gidudu & Heinz, 2007; Jain, 1997; Pal & Foody, 2010).…”
Section: State of the Art
confidence: 99%
“…In addition to the aforementioned progress, the use of hyperspectral data has also come into the focus of research on environmental mapping over the years (Plaza et al., 2009; Camps-Valls et al., 2014), as such information e.g. allows distinguishing different types of vegetation (Bradley et al., 2018) and different materials (Ilehag et al., 2017). This, in turn, can be helpful if the corresponding geometric structure of the observed scene is similar.…”
Section: Related Work
confidence: 99%
“…When using high-dimensional hyperspectral data, however, the high degree of redundancy contained in these data typically decreases the predictive accuracy of a classifier (Melgani and Bruzzone, 2004; Bradley et al., 2018), so that approaches for dimensionality reduction or band selection are commonly involved. In this context, dimensionality reduction techniques focus on deriving a new data representation based on fewer, but potentially better features extracted from the given data representation.…”
Section: Related Work
confidence: 99%
“…However, a major issue affecting high spectral dimensions is the Hughes phenomenon [35], in which a large number of bands with narrow intervals leads to high correlation between adjacent bands and redundant information that interferes with classification. Therefore, many studies have tried to reduce the number of bands of hyperspectral remote sensing imagery, with little loss of information, to address this "dimensionality disaster" [17,36,37]. Using GI algorithms to search for the optimal combination of bands is one state-of-the-art approach to dimension reduction [38].…”
Section: Introduction
confidence: 99%