2016
DOI: 10.1016/j.acha.2015.07.005

Diffusion-based kernel methods on Euclidean metric measure spaces

Cited by 10 publications (15 citation statements)
References 28 publications

“…While several similarity kernels are used in practice to construct the diffusion operator $P$, a standard choice is the Gaussian affinity $k_\varepsilon(x, y) = \exp(-\|x - y\|^2/\varepsilon)$ [13, 82-84], in which case we denote the diffusion operator $P_\varepsilon$, where $\varepsilon$ determines the neighborhood radius. This kernel choice is often seen in theoretical and mathematical work due to its established properties on data sampled from locally low dimensional geometries (i.e., data manifolds) [82, 85, 86].…”
Section: Diffusion Information Geometry for Visualization and Condensation
confidence: 99%
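A minimal numerical sketch of the construction quoted above, assuming a finite sample and using illustrative names, parameter values, and toy data (this is our own illustration, not code from the cited works): the Gaussian affinity is row-normalized into the Markov diffusion operator $P_\varepsilon$.

```python
import numpy as np

def diffusion_operator(X, eps):
    """Row-stochastic diffusion operator from the Gaussian affinity
    k_eps(x, y) = exp(-||x - y||^2 / eps) on a finite sample X (n points x d features)."""
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq_dists / eps)              # Gaussian affinity matrix
    return K / K.sum(axis=1, keepdims=True)  # normalize rows into a Markov matrix

# Illustrative usage: points sampled near a circle, a 1-D data manifold in R^2.
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2.0 * np.pi, 200)
X = np.c_[np.cos(theta), np.sin(theta)] + 0.01 * rng.normal(size=(200, 2))
P_eps = diffusion_operator(X, eps=0.1)
assert np.allclose(P_eps.sum(axis=1), 1.0)   # each row is a probability distribution
```

Here $\varepsilon$ plays the role of a squared neighborhood radius: affinities decay to (numerically) zero for pairs much farther apart than $\sqrt{\varepsilon}$.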
“…For simplicity, the integration notation $\int \cdot \, dy$ in this paper will refer to the Lebesgue integral $\int_M \cdot \, dy$ over the manifold, instead of the whole space $\mathbb{R}^n$. Further, while (for simplicity) such integrals are written without a specific measure, one can equivalently, w.l.o.g., replace $dx$ with an appropriate measure representing the data sampling distribution over $M$. Let $g(x, y) \triangleq \exp(-\|x - y\|^2/\varepsilon)$, $x, y \in M$, $\varepsilon > 0$, define the Gaussian kernel $Gf(x) = \int g(x, y) f(y)\, dy$ used in [3] to capture local neighborhoods from data sampled from $M$. Following [3] and related work, we define the Gaussian degree $q(x) = \|g(x, \cdot)\|_1 = \int g(x, y)\, dy$ and assume it provides a suitable approximation of the distribution (or local density) of the data over the manifold $M$. Finally, given a measure $\mu$ over the manifold, an MGC kernel [1], [2] is defined as $k_\mu(x, y) = \int g(x, r) g(y, r)\, d\mu(r)$. Note that while we use a Gaussian kernel for the remainder of this work, the definitions and theorems to follow do not depend on the choice of $g$, so long as it is a kernel function.…”
Section: A. Preliminaries
confidence: 99%
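The quoted definitions become computable once the measure $\mu$ is approximated by an empirical measure on reference points $r_1, \dots, r_m$ with weights $w_j$. The sketch below is our own discrete rendering of the Gaussian affinity $g$, the Gaussian degree $q$, and the MGC kernel $k_\mu$; the function names and the weighting scheme are assumptions, not the authors' code.

```python
import numpy as np

def gaussian_affinity(A, B, eps):
    """g(x, y) = exp(-||x - y||^2 / eps) between the rows of A and the rows of B."""
    sq = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq / eps)

def gaussian_degree(X, R, weights, eps):
    """Discrete surrogate of q(x) = ||g(x, .)||_1: integrate g(x, r) against the
    empirical measure carried by the reference points R."""
    return gaussian_affinity(X, R, eps) @ weights

def mgc_kernel(X, R, weights, eps):
    """Discrete surrogate of k_mu(x, y) = integral of g(x, r) g(y, r) dmu(r),
    with mu approximated by reference points R carrying nonnegative weights."""
    G = gaussian_affinity(X, R, eps)   # G[i, j] = g(x_i, r_j)
    return (G * weights) @ G.T         # sum_j w_j * g(x_i, r_j) * g(x_k, r_j)
```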
“…Note that while we use a Gaussian kernel for the remainder of this work, the definitions and theorems to follow do not depend on the choice of g, so long as it is a kernel function. The original MGC construction in [2] considered measures that represent data distribution, and used the constructed MGC kernel to define diffusion maps (see Sec. II-B and [3]) with them.…”
Section: A. Preliminaries
confidence: 99%
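As a hedged illustration of "used the constructed MGC kernel to define diffusion maps": the standard diffusion-maps recipe can be applied to any symmetric nonnegative kernel matrix, such as one returned by the MGC sketch above. This follows the usual construction rather than reproducing the exact derivation in [2], and the function name and defaults are ours.

```python
import numpy as np

def diffusion_map(K, n_components=2, t=1):
    """Diffusion-map coordinates from a symmetric nonnegative kernel matrix K.
    K is row-normalized into a Markov matrix P = D^{-1} K; each point is embedded
    with the leading non-trivial right eigenvectors of P scaled by eigenvalue^t."""
    d = K.sum(axis=1)
    S = K / np.sqrt(np.outer(d, d))    # symmetric conjugate of P, same eigenvalues
    vals, vecs = np.linalg.eigh(S)
    order = np.argsort(vals)[::-1]     # sort eigenvalues in decreasing order
    vals, vecs = vals[order], vecs[:, order]
    psi = vecs / np.sqrt(d)[:, None]   # recover right eigenvectors of P
    # Skip the trivial constant eigenvector (eigenvalue 1).
    return psi[:, 1:n_components + 1] * vals[1:n_components + 1] ** t
```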
“…The Measure-based Gaussian Correlation (MGC) kernel [4, 3] defines the affinities between elements of $X$, which in this context is referred to as the analyzed domain, via their relations with the reference dataset $M$, which is referred to as the measure domain. This framework enables a flexible representation of $X$, as long as $M$, which in some sense characterizes the data, is sufficiently large.…”
Section: Measure-based Gaussian Correlation Kernel
confidence: 99%
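A small self-contained example of the analyzed-domain / measure-domain split described above; the data, bandwidth, and uniform reference weights are purely illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))        # analyzed domain: points whose affinities we want
M = rng.normal(size=(2000, 3))       # measure domain: a large reference dataset
w = np.full(len(M), 1.0 / len(M))    # uniform empirical measure on the reference set
eps = 1.0

# MGC affinities between elements of X, computed via their relations with M.
G = np.exp(-np.sum((X[:, None, :] - M[None, :, :]) ** 2, axis=-1) / eps)  # g(x_i, m_j)
K = (G * w) @ G.T                    # K[i, k] approximates k_mu(x_i, x_k)
```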
“…In this paper, we focus on deriving a representation that preserves the diffusion distances between multidimensional data points, based on the MGC framework [4, 3]. This representation can be used to process very large datasets efficiently by imposing a Markovian diffusion process to define and represent the non-linear relations between multidimensional data points.…”
Section: Introduction
confidence: 99%
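For orientation only, the diffusion distances that such a representation aims to preserve can be computed directly (though not scalably) from a symmetric kernel matrix. The sketch below follows the standard definition and is not the efficient MGC-based construction the quoted paper derives.

```python
import numpy as np

def diffusion_distances(K, t=1):
    """Pairwise diffusion distances at time t from a symmetric kernel matrix K:
    D_t(x_i, x_j)^2 = sum_z (P^t[i, z] - P^t[j, z])^2 / pi(z),
    where P is the row-normalized Markov matrix and pi its stationary distribution."""
    d = K.sum(axis=1)
    P = K / d[:, None]                 # Markovian diffusion operator
    pi = d / d.sum()                   # stationary distribution of P (K symmetric)
    Pt = np.linalg.matrix_power(P, t)  # t-step transition probabilities
    W = Pt / np.sqrt(pi)               # weight column z by 1 / sqrt(pi(z))
    sq = np.sum(W ** 2, axis=1)
    D2 = sq[:, None] + sq[None, :] - 2.0 * (W @ W.T)
    return np.sqrt(np.maximum(D2, 0.0))
```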