2014
DOI: 10.1109/tgrs.2013.2241773

Nearest Regularized Subspace for Hyperspectral Classification

Abstract: A classifier that couples nearest-subspace classification with a distance-weighted Tikhonov regularization is proposed for hyperspectral imagery. The resulting nearest-regularized-subspace classifier seeks an approximation of each testing sample via a linear combination of training samples within each class. The class label is then derived according to the class which best approximates the test sample. The distance-weighted Tikhonov regularization is then modified by measuring distance within a localit…
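The classifier the abstract describes can be sketched compactly: for each class, the test pixel is approximated as a ridge-style linear combination of that class's training samples, with a Tikhonov term that weights each sample by its Euclidean distance from the test pixel, and the label with the smallest approximation residual wins. The following is a minimal NumPy sketch under those assumptions; the function and parameter names (`nrs_classify`, `lam`) are illustrative, not the paper's notation.

```python
import numpy as np

def nrs_classify(y, class_dicts, lam=0.01):
    """Nearest-regularized-subspace rule for one test pixel y (length-B spectrum).

    class_dicts maps each class label to a (B, N_c) matrix whose columns are
    training spectra. lam scales the distance-weighted Tikhonov penalty
    (an assumed knob; tune per dataset).
    """
    best_label, best_residual = None, np.inf
    for label, X in class_dicts.items():
        # Distance-weighted Tikhonov matrix: training samples far from the
        # test pixel receive a larger penalty on their combination weight.
        dists = np.linalg.norm(X - y[:, None], axis=0)
        Gamma = lam * np.diag(dists)
        # Closed-form ridge weights for the approximation X @ w ~ y.
        w = np.linalg.solve(X.T @ X + Gamma.T @ Gamma, X.T @ y)
        residual = np.linalg.norm(y - X @ w)
        if residual < best_residual:
            best_label, best_residual = label, residual
    return best_label
```

The distance weighting is what distinguishes this rule from plain collaborative representation: nearby training samples are left nearly unpenalized, while distant ones are pushed toward zero weight.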

Cited by 213 publications (80 citation statements)
References 31 publications
“…In hyperspectral image analysis, the wealth of spectral information, which comes at the cost of high spectral dimensionality, can better discriminate the materials in an observed area. However, high dimensionality leads to the curse of dimensionality, which causes classification performance to deteriorate, especially when the number of available labeled training samples is limited [1][2][3][4][5][6].…”
Section: Introduction
confidence: 99%
“…The proposed methods, NFLE [20,21], KNFLE, FNFLE [26], and FKNFLE, were compared with two state-of-the-art algorithms, i.e., nearest regularized subspace (NRS) [25] and NRS-LFDA [25]. The parameter configurations for both NRS [29] and NRS-LFDA followed those in [25].…”
Section: Classification Results
confidence: 99%
“…Using the technique in Section 3, 15 bands are generated with multiplication, and another 15 bands are generated with division. There are, in total, 16 different classes in the original ground truth; however, we select eight classes from the original dataset from a statistical viewpoint [5]. The eight classes we used in the experiments are Corn-no-till, Corn-min-till, Grass-pasture, Hay-windrowed, Soybean-no-till, Soybean-min-till, Soybean-clean, and Woods.…”
Section: Data Description and Experimental Setup
confidence: 99%
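The "15 bands with multiplication, 15 with division" in the quote above is consistent with taking the C(6, 2) = 15 unordered pairs of six selected bands and forming one product band and one ratio band per pair. That pairing scheme is an assumption on my part (the quoted text does not spell it out); a minimal sketch under that assumption:

```python
from itertools import combinations

import numpy as np

def augment_bands(cube):
    """Append pairwise product and ratio bands to an (H, W, B) hyperspectral cube.

    For B = 6 input bands this yields 15 product bands and 15 ratio bands,
    for 36 bands total. Assumed scheme; the cited paper may differ.
    """
    H, W, B = cube.shape
    prods = [cube[:, :, i] * cube[:, :, j] for i, j in combinations(range(B), 2)]
    # Small epsilon guards against division by zero in dark pixels.
    ratios = [cube[:, :, i] / (cube[:, :, j] + 1e-12)
              for i, j in combinations(range(B), 2)]
    return np.dstack([cube] + prods + ratios)
```

Such nonlinear band combinations are a common way to expose spectral structure (akin to band-ratio indices) before feeding pixels to a classifier.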
“…Instead, in these methods, a testing pixel is classified based on its representation residual over labeled samples. The nearest regularized subspace (NRS) [5] is an improved version of CRC, in which training samples similar to the testing pixel are allowed to carry high weights in the representation. Other variants of SRC or CRC have been proposed for hyperspectral imagery.…”
Section: Introduction
confidence: 99%