Twenty-First International Conference on Machine Learning - ICML '04 2004
DOI: 10.1145/1015330.1015417
A kernel view of the dimensionality reduction of manifolds


Cited by 404 publications (270 citation statements) · References 7 publications
“…In this section we interpret the algorithm in terms of random walks inspired by (Ham et al., 2003). We will see that this method simply classifies the points by comparing a specific distance measure between them and the labeled points of different classes.…”
Section: Lazy Random Walks
confidence: 99%
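The excerpt above describes classifying points by comparing random-walk-based distances to labeled points on a similarity graph. As an illustration of the underlying machinery (not the cited algorithm itself), here is a minimal sketch of a lazy random-walk transition matrix; the `laziness` parameter and the toy graph are illustrative assumptions.

```python
import numpy as np

def lazy_random_walk_matrix(W, laziness=0.5):
    """Transition matrix of a lazy random walk on a weighted similarity graph.

    With probability `laziness` the walker stays at its current node;
    otherwise it moves to a neighbour with probability proportional to
    the edge weight. `laziness=0.5` is an illustrative choice.
    """
    D_inv = np.diag(1.0 / W.sum(axis=1))   # inverse degree matrix
    P = D_inv @ W                           # plain random-walk transitions
    n = W.shape[0]
    return laziness * np.eye(n) + (1.0 - laziness) * P

# Usage: a tiny 3-node weighted graph (hypothetical data)
W = np.array([[0.0, 1.0, 0.5],
              [1.0, 0.0, 0.2],
              [0.5, 0.2, 0.0]])
P = lazy_random_walk_matrix(W)
print(np.allclose(P.sum(axis=1), 1.0))  # rows sum to 1: True
```

Each row of the resulting matrix is a probability distribution, so iterating it yields the multi-step transition probabilities from which walk-based distances can be derived.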
“…Embedding is then obtained by solving an eigenvalue problem on such a matrix. It was shown in [25,26] that these approaches are all instances of kernel-based learning, in particular kernel principal component analysis (KPCA) [27]. Several approaches have been proposed to embed new data points, denoted out-of-sample embedding, e.g.…”
Section: Manifold Representations
confidence: 99%
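The excerpt notes that these manifold methods reduce to solving an eigenvalue problem on a (centered) kernel matrix, i.e. kernel PCA. A minimal sketch of that eigenvalue step, assuming an RBF kernel and random toy data (both illustrative choices, not taken from the cited papers):

```python
import numpy as np

def kernel_pca(X, n_components=2, gamma=1.0):
    """Sketch of kernel PCA: eigendecompose the centered kernel matrix.

    The RBF kernel and the `gamma` value are assumptions for illustration.
    """
    # Pairwise squared Euclidean distances, then RBF kernel matrix
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-gamma * sq_dists)

    # Double-center the kernel matrix: K_c = (I - 1/n) K (I - 1/n)
    n = K.shape[0]
    one_n = np.full((n, n), 1.0 / n)
    K_c = K - one_n @ K - K @ one_n + one_n @ K @ one_n

    # Eigendecomposition; eigh returns eigenvalues in ascending order
    eigvals, eigvecs = np.linalg.eigh(K_c)
    idx = np.argsort(eigvals)[::-1][:n_components]

    # Embedding: leading eigenvectors scaled by sqrt of their eigenvalues
    return eigvecs[:, idx] * np.sqrt(np.maximum(eigvals[idx], 0.0))

# Usage: embed 10 random 3-D points into 2 dimensions
X = np.random.default_rng(0).normal(size=(10, 3))
Y = kernel_pca(X, n_components=2)
print(Y.shape)  # (10, 2)
```

Swapping the kernel matrix here for a graph-derived one (as the manifold methods in the excerpt effectively do) changes the geometry of the embedding while the eigenvalue machinery stays the same.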
“…This inspires the use of kernel machines, which explore the non-linearity of the data space. The extended nonlinear alternatives, KPCA [19,23] and KDA [20], are used.…”
Section: Subspace Projection
confidence: 99%