Proceedings of the 2012 SIAM International Conference on Data Mining 2012
DOI: 10.1137/1.9781611972825.17
Heterogeneous datasets representation and learning using diffusion maps and Laplacian pyramids

Abstract: Diffusion maps together with geometric harmonics provide a method for describing the geometry of high-dimensional data and for extending these descriptions to new data points and to functions defined on the data. This method suffers from two limitations. First, even though real-life data is often heterogeneous, diffusion maps assume that the attributes of the processed dataset are comparable. Second, application of the geometric harmonics requires careful setting for the corre…

Cited by 45 publications (64 citation statements)
References 16 publications (22 reference statements)
“…If the CPU is insufficiently powerful, computing the DM for the entire dataset X is difficult. Therefore, to solve the problem, a DM can be constructed from random samples of X and then extended to all points using the Nyström extension method [9,20].…”
Section: The NS + DM Algorithm for SSS Image Target Detection
confidence: 99%
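The sample-then-extend scheme this snippet describes can be sketched as follows. This is a minimal illustration, not the cited papers' exact construction: the Gaussian kernel, the scale parameter `eps`, and the function name are assumptions. The diffusion map is computed on the sample via the symmetric conjugate of the Markov matrix, and new points are embedded with the Nyström formula ψ(x) = (1/λ) Σ_i P(x, x_i) ψ(x_i).

```python
import numpy as np

def diffusion_map_nystrom(X_sample, X_new, eps, n_components=2):
    """Illustrative sketch: diffusion map on a sample, Nystrom-extended
    to new points. `eps` is the (assumed) Gaussian kernel scale."""
    # Gaussian kernel on the sample
    d2 = np.sum((X_sample[:, None, :] - X_sample[None, :, :]) ** 2, axis=-1)
    K = np.exp(-d2 / eps)
    d = K.sum(axis=1)
    # Symmetric conjugate of the Markov matrix P = D^{-1} K (stable eigensolve)
    S = K / np.sqrt(d[:, None] * d[None, :])
    vals, vecs = np.linalg.eigh(S)
    idx = np.argsort(-vals)[: n_components + 1]
    vals, vecs = vals[idx], vecs[:, idx]
    psi = vecs / np.sqrt(d[:, None])   # right eigenvectors of P
    lam, psi = vals[1:], psi[:, 1:]    # drop the trivial constant eigenvector
    # Nystrom extension: psi(x) = (1/lam) * sum_i P(x, x_i) psi(x_i)
    d2_new = np.sum((X_new[:, None, :] - X_sample[None, :, :]) ** 2, axis=-1)
    K_new = np.exp(-d2_new / eps)
    P_new = K_new / K_new.sum(axis=1)[:, None]
    psi_new = (P_new @ psi) / lam[None, :]
    # Diffusion coordinates scale eigenvectors by their eigenvalues
    return psi * lam[None, :], psi_new * lam[None, :]
```

Extending a sample point through the Nyström formula reproduces its in-sample coordinates exactly, which is a quick sanity check for an implementation.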
“…From a general point of view, LP is a multiscale algorithm for extending sample-based function values (in our case, the eigenvectors ψ_j(x_i)) that uses different scalings for different resolutions. More precisely, we can apply it [16] to approximate a function f from its values f(x_k) on a sample S = {x_1, …”
Section: Laplacian Pyramids
confidence: 99%
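The multiscale idea quoted above — smooth the current residual with progressively narrower kernels and sum the levels — can be sketched as below. The dyadic scale schedule `eps0 / 2**l`, the row-normalized Gaussian smoother, and the default parameters are illustrative assumptions, not the exact scheme of [16].

```python
import numpy as np

def laplacian_pyramid_extend(X, f, X_new, eps0=1.0, levels=8):
    """Illustrative Laplacian-Pyramid sketch: extend sampled values f(x_i)
    to new points by summing kernel smoothings of successive residuals."""
    def smoother(A, B, eps):
        # Row-normalized Gaussian kernel: a Nadaraya-Watson-style smoother
        d2 = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=-1)
        W = np.exp(-d2 / eps)
        return W / W.sum(axis=1, keepdims=True)

    residual = f.copy()
    approx_new = np.zeros(len(X_new))
    for l in range(levels):
        eps = eps0 / 2 ** l          # halve the kernel scale at each level
        approx_new += smoother(X_new, X, eps) @ residual
        residual = residual - smoother(X, X, eps) @ residual
    return approx_new
```

Coarse levels capture the slowly varying part of f; each finer level corrects what the previous scales missed, so the summed approximation converges toward the sample values as levels are added.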
“…The first one is an extension to the non-symmetric transition matrix P of the classical Nyström formula [15] for symmetric, positive semidefinite matrices derived from a kernel, which extends the eigenvectors of the sample kernel matrix to the eigenfunctions of the underlying integral operator. The second one, the Laplacian Pyramids algorithm [16], also relies on a kernel representation but starts from the discrete sample values f(x_i) (in our case, the eigenvectors of the sample-based Markov transition matrix P) of a certain function f (in our case, the general eigenfunctions), and seeks a multiscale representation of f that allows one to approximate the values f(x) from an appropriate multiscale combination of the sample values f(x_i).…”
Section: Introduction
confidence: 99%
“…In [10] a method is proposed to adapt DM to heterogeneous features by dealing separately with groups of attributes that are deemed to be homogeneous. More …”
Section: Algorithm 1 Diffusion Maps Algorithm
confidence: 99%
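One simple way to treat homogeneous attribute groups separately, as the snippet above describes, is to build a kernel per group with its own scale and combine them; multiplying the kernels sums the per-group scaled distances in the exponent. This sketch is an assumption for illustration — the grouping strategy, the median-distance scale, and the product combination are not claimed to be the exact construction of [10].

```python
import numpy as np

def grouped_affinity(X, groups):
    """Illustrative sketch: one Gaussian kernel per homogeneous attribute
    group, each with its own scale, multiplied into a single affinity
    matrix suitable as input to a diffusion map."""
    n = len(X)
    W = np.ones((n, n))
    for cols in groups:
        G = X[:, cols]
        d2 = np.sum((G[:, None, :] - G[None, :, :]) ** 2, axis=-1)
        eps = np.median(d2[d2 > 0])    # per-group scale (assumed heuristic)
        W *= np.exp(-d2 / eps)         # product = sum of scaled distances
    return W
```

Because each group is normalized by its own scale, no single set of attributes dominates the affinity, which is the point of handling heterogeneous features group by group.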
“…Moreover, a main drawback (as also happens in spectral DR) is the difficulty of applying the computed DM projection to new, unseen patterns. There are several proposals for this, such as Nyström formulae [4] or Laplacian Pyramids [10], but this is still an area where further work is needed.…”
Section: Algorithm 1 Diffusion Maps Algorithm
confidence: 99%