2005
DOI: 10.1007/11589990_59

Kernel Nonparametric Weighted Feature Extraction for Classification

Cited by 23 publications (29 citation statements)
References 6 publications
“…In MFLDA, as distinct from LDA, the total scatter matrix is used instead of the within-class scatter matrix, and class signatures are used as the class means. Kuo and Landgrebe (Kuo & Landgrebe, 2004) propose a nonparametric weighted feature extraction method (NWFE). In this approach, different weights are computed for every sample.…”
Section: Introduction (mentioning)
confidence: 99%
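
To make the per-sample weighting concrete, below is a minimal NumPy sketch of the NWFE idea as described by Kuo & Landgrebe (2004): every sample gets a distance-weighted local mean in each class, and samples whose local means lie close by (i.e., samples near the class boundary) receive larger scatter weights. The function names, the eps regularizer, and the simplified normalization are assumptions for illustration, not the authors' exact formulation.

```python
import numpy as np

def nwfe_scatter_matrices(X, y, eps=1e-8):
    """Sketch of NWFE-style nonparametric scatter matrices.

    Each sample x_l of class i gets a weighted local mean in class j,
    with weights inversely proportional to distance; samples close to
    their local mean (near the class boundary) get larger scatter weights.
    """
    classes = np.unique(y)
    d = X.shape[1]
    Sb = np.zeros((d, d))   # nonparametric between-class scatter
    Sw = np.zeros((d, d))   # nonparametric within-class scatter
    for ci in classes:
        Xi = X[y == ci]
        prior = np.mean(y == ci)
        for cj in classes:
            Xj = X[y == cj]
            # pairwise distances from class-i samples to class-j samples
            dist = np.linalg.norm(Xi[:, None, :] - Xj[None, :, :], axis=2)
            if ci == cj:
                np.fill_diagonal(dist, np.inf)   # exclude the sample itself
            w = 1.0 / (dist + eps)
            w /= w.sum(axis=1, keepdims=True)    # per-sample weights
            local_means = w @ Xj                 # weighted local mean of x_l in class j
            diff = Xi - local_means
            lam = 1.0 / (np.linalg.norm(diff, axis=1) + eps)
            lam /= lam.sum()                     # scatter weights, larger near the boundary
            S = (diff * lam[:, None]).T @ diff
            if ci == cj:
                Sw += prior * S
            else:
                Sb += prior * S
    return Sb, Sw

# Features are then the leading eigenvectors of inv(Sw) @ Sb (Sw is usually
# regularized first); a kernel variant would form the same quantities in a
# kernel-induced feature space.
```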
“…The dimension of the hyperspectral data is reduced by using CGLDA. Kuo and Landgrebe emphasize the use of local information to improve discriminant analysis for feature extraction (Kuo & Landgrebe, 2004). The advantages of using not only global but also local pattern information are examined in hyperspectral image processing.…”
Section: Introduction (mentioning)
confidence: 99%
“…The Fisher discriminant [27] is used to rank the features based on the observations of the various samples. Feature extraction for high-dimensional data is also done using nonparametric methods [28]. Here the ranking is computed over all samples in the dataset.…”
Section: International Journal of Applied Information Systems (IJAIS) (mentioning)
confidence: 99%
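
For illustration only, one common form of the Fisher score used to rank individual features is the ratio of between-class variance to pooled within-class variance; the sketch below assumes that form and may differ from the exact definition used in [27].

```python
import numpy as np

def fisher_scores(X, y):
    """Per-feature Fisher ratio: between-class variance divided by the
    pooled within-class variance. A higher score means the feature
    separates the classes better, so features can be ranked by it."""
    classes = np.unique(y)
    overall_mean = X.mean(axis=0)
    between = np.zeros(X.shape[1])
    within = np.zeros(X.shape[1])
    for c in classes:
        Xc = X[y == c]
        between += len(Xc) * (Xc.mean(axis=0) - overall_mean) ** 2
        within += len(Xc) * Xc.var(axis=0)
    return between / (within + 1e-12)

# Example ranking (best features first):
# order = np.argsort(fisher_scores(X, y))[::-1]
```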
“…Yuan et al. [36] utilized a hypergraph embedding model for HSI feature reduction, in which the spatial hypergraph models (SHs) are constructed by selecting the K-nearest neighbors within the spatial region of the centroid pixel. Experimental results demonstrated that SH outperformed many existing feature extraction methods for HSI classification, including the raw spectral feature (RAW), PCA, LPP, LDA, nonparametric weighted feature extraction (NWFE) [37] and semi-supervised local discriminant analysis (SELD) [38]. However, SH is designed to learn the projection matrix for reducing only the spectral features.…”
Section: Introduction (mentioning)
confidence: 99%
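
As a rough sketch of the spatial-neighbor selection described for SH, the snippet below picks, for each centroid pixel, the K spectrally nearest pixels inside a fixed spatial window and groups them with the centroid into one hyperedge. The window size, the Euclidean spectral distance, and all names here are assumptions for illustration, not details taken from [36].

```python
import numpy as np

def spatial_hyperedges(hsi, k=8, window=5):
    """Build one hyperedge per pixel: the centroid pixel plus its k
    spectrally nearest neighbors inside a (window x window) spatial
    neighborhood. hsi has shape (rows, cols, bands)."""
    rows, cols, bands = hsi.shape
    half = window // 2
    edges = []
    for r in range(rows):
        for c in range(cols):
            r0, r1 = max(0, r - half), min(rows, r + half + 1)
            c0, c1 = max(0, c - half), min(cols, c + half + 1)
            patch = hsi[r0:r1, c0:c1].reshape(-1, bands)
            coords = [(i, j) for i in range(r0, r1) for j in range(c0, c1)]
            dists = np.linalg.norm(patch - hsi[r, c], axis=1)
            nearest = np.argsort(dists)[1:k + 1]   # skip the centroid itself
            edges.append([(r, c)] + [coords[n] for n in nearest])
    return edges
```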