2001
DOI: 10.1109/97.895369
Kernel principal component analysis for texture classification

Cited by 39 publications (18 citation statements)
References 2 publications
“…KPCA is an improved approach based on PCA for nonlinear dimension reduction [8,10]. It operates by projecting a signal sample x from an n-dimensional to an N-dimensional (N > n) space via an unknown mapping Φ, which makes the mapped data Φ(x) linearly separable in the N-dimensional space.…”
Section: A. KPCA Theory
Mentioning confidence: 99%
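A minimal sketch of the KPCA procedure the statement above describes, using numpy only. The RBF kernel choice, the sample data, and the function name kpca are illustrative assumptions, not details taken from the cited paper.

```python
import numpy as np

def kpca(X, n_components=2, gamma=1.0):
    """Kernel PCA sketch with an RBF kernel (assumed choice)."""
    # Pairwise squared Euclidean distances
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    # RBF kernel matrix: K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    K = np.exp(-gamma * d2)
    # Center the kernel matrix in feature space:
    # K' = K - 1_n K - K 1_n + 1_n K 1_n
    n = K.shape[0]
    one_n = np.ones((n, n)) / n
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # Eigendecomposition; keep the leading eigenpairs
    eigvals, eigvecs = np.linalg.eigh(Kc)
    idx = np.argsort(eigvals)[::-1][:n_components]
    alphas, lambdas = eigvecs[:, idx], eigvals[idx]
    # Projections of the training samples onto the principal
    # directions in feature space
    return alphas * np.sqrt(np.maximum(lambdas, 0))

X = np.random.default_rng(0).normal(size=(100, 8))
Z = kpca(X, n_components=2, gamma=0.5)
print(Z.shape)  # (100, 2)
```

The eigendecomposition is performed on the centered kernel matrix rather than a covariance matrix, which is what lets the mapping Φ remain implicit.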
“…Recently, nonlinear scale-space approaches, such as kernel principal component analysis (KPCA) [26] and Laplacian Eigenmaps [27-29], have been considered more efficient ways to extract discriminant information. In this paper, we propose a reference-frame subspace approach using Laplacian Eigenmaps, which selects a limited number of reference frames to form a Laplacian eigen-subspace in which inter-frame dissimilarity is measured.…”
Section: Fig. 1. A Three-Layer Video Abstraction System
Mentioning confidence: 99%
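For context, a minimal Laplacian Eigenmap sketch in the spirit of the statement above; the k-nearest-neighbor graph and heat-kernel weights are common choices assumed here, not details taken from the cited work.

```python
import numpy as np

def laplacian_eigenmap(X, n_components=2, k=5, t=1.0):
    """Illustrative Laplacian Eigenmap: kNN graph with heat-kernel
    weights, embedded via generalized Laplacian eigenvectors."""
    n = X.shape[0]
    sq = np.sum(X**2, axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2 * X @ X.T, 0)
    # Symmetric k-nearest-neighbor adjacency, heat-kernel weighted
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d2[i])[1:k + 1]  # skip self at index 0
        W[i, nbrs] = np.exp(-d2[i, nbrs] / t)
    W = np.maximum(W, W.T)  # symmetrize
    D = np.diag(W.sum(axis=1))
    L = D - W  # unnormalized graph Laplacian
    # Solve L y = lambda D y via the normalized form
    d_inv_sqrt = np.diag(1.0 / np.sqrt(np.diag(D)))
    eigvals, eigvecs = np.linalg.eigh(d_inv_sqrt @ L @ d_inv_sqrt)
    # Discard the trivial constant eigenvector (eigenvalue ~ 0)
    return d_inv_sqrt @ eigvecs[:, 1:n_components + 1]

frames = np.random.default_rng(1).normal(size=(60, 16))
emb = laplacian_eigenmap(frames, n_components=2)
print(emb.shape)  # (60, 2)
```

Inter-frame dissimilarity can then be measured as Euclidean distance between rows of the returned embedding.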
“…Non-linear scale-space approaches [26-30] have recently been intensively researched. Among them, the Laplacian Eigenmap [27-30] has been regarded as one of the most effective, outperforming the traditional linear SVD or LSA approaches.…”
Section: A. Laplacian Eigenmap
Mentioning confidence: 99%
“…Examples of successful kernel extensions in feature extraction include kernel principal component analysis (KPCA) [3], kernel discriminant analysis (KDA) [4,2,5], kernel-based orthogonal subspace projection [6], kernel Foley-Sammon optimal discriminant vectors [7], and kernel-based matched subspace detectors [8]. Beyond feature extraction, the kernel trick has also been applied to learning machines [9-11] and to other applications [12-14], such as blind source separation [15] and object tracking [16-20], to increase discrimination between the objects of interest. To evaluate the quality of a kernel space, a pool of useful distance measures has been derived and extended to kernel versions, including the Chernoff distance, Bhattacharyya distance, Kullback-Leibler divergence, Patrick-Fisher distance, and Mahalanobis distance [21,22].…”
Section: Introduction
Mentioning confidence: 99%
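To make the kernel trick behind such kernelized distance measures concrete: the squared Euclidean distance between two points in the implicit feature space follows from kernel evaluations alone, since ||Φ(x) - Φ(y)||² = k(x,x) - 2k(x,y) + k(y,y). A minimal sketch, with the RBF kernel as an assumed choice:

```python
import numpy as np

def rbf(x, y, gamma=1.0):
    # k(x, y) = exp(-gamma * ||x - y||^2)
    return np.exp(-gamma * np.sum((x - y) ** 2))

def feature_space_dist2(x, y, kernel=rbf):
    # ||Phi(x) - Phi(y)||^2 = k(x,x) - 2*k(x,y) + k(y,y),
    # computed without ever forming Phi explicitly
    return kernel(x, x) - 2 * kernel(x, y) + kernel(y, y)

x = np.array([1.0, 2.0])
y = np.array([1.5, 1.0])
print(feature_space_dist2(x, y))
```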