2014
DOI: 10.1109/tpami.2014.2324585
Information Theoretic Shape Matching

Abstract: In this paper, we describe two related algorithms that provide both rigid and non-rigid point set registration with different computational complexity and accuracy. The first algorithm utilizes a nonlinear similarity measure known as correntropy. The measure combines second- and higher-order moments in its decision statistic, showing improvements especially in the presence of impulsive noise. The algorithm assumes that the correspondence between the point sets is known, which is determined with the surprise metric…
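The correntropy measure described in the abstract can be sketched numerically. The following is a minimal illustration, not the authors' implementation: it assumes a Gaussian kernel with bandwidth `sigma` and shows why a single impulsive error barely moves correntropy while it dominates MSE.

```python
import numpy as np

def correntropy(x, y, sigma=1.0):
    """Empirical correntropy with a Gaussian kernel.

    Averages a Gaussian kernel over the pointwise errors, so a large
    (impulsive) error contributes almost nothing -- unlike MSE, where
    a single large error dominates the average.
    """
    err = np.asarray(x) - np.asarray(y)
    return np.mean(np.exp(-err**2 / (2.0 * sigma**2)))

rng = np.random.default_rng(0)
x = rng.normal(size=100)
y_clean = x + 0.01 * rng.normal(size=100)   # nearly perfect match
y_outlier = y_clean.copy()
y_outlier[0] += 100.0                       # one impulsive error

mse_clean = np.mean((x - y_clean) ** 2)
mse_out = np.mean((x - y_outlier) ** 2)     # blows up
v_clean = correntropy(x, y_clean)           # close to 1
v_out = correntropy(x, y_outlier)           # drops by at most 1/N
```

The outlier shifts the correntropy value by at most 1/N (that sample's kernel contribution falls from roughly 1 to roughly 0), whereas the MSE grows without bound in the size of the error.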

Cited by 66 publications
(34 citation statements)
References 28 publications
“…Bessa et al adopted MCC to train neural networks for wind prediction in power systems [34]. Hasanbelliu et al utilized information theoretic measures (entropy and correntropy) to develop two algorithms that handle both rigid and non-rigid point set registration with different computational complexities and accuracies [35]. However, constrained adaptive filtering based on MCC has not yet been studied in the literature.…”
Section: Introduction
Mentioning (confidence: 99%)
“…Those two families of distances can be symmetrized and encompass both the Cauchy-Schwarz divergence and the family of skew Bhattacharyya divergences. Since the Cauchy-Schwarz divergence is often used in distribution clustering applications [22], we carried out preliminary experiments demonstrating that the symmetrized Hölder divergences improve over the Cauchy-Schwarz divergence on a toy dataset of Gaussians. We briefly touched upon the use of these novel divergences in statistical estimation theory.…”
Section: Discussion
Mentioning (confidence: 99%)
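The Cauchy-Schwarz divergence mentioned in this snippet has a closed form between univariate Gaussians, which makes it easy to compute on toy data. The sketch below uses hypothetical helper names (not from the cited works) and the standard Gaussian product-integral identity.

```python
import numpy as np

def gauss_prod_integral(m1, s1, m2, s2):
    """Closed form of the cross term: integral of N(x; m1, s1^2) * N(x; m2, s2^2) dx."""
    v = s1**2 + s2**2
    return np.exp(-(m1 - m2)**2 / (2.0 * v)) / np.sqrt(2.0 * np.pi * v)

def cauchy_schwarz_div(m1, s1, m2, s2):
    """D_CS(p, q) = -log( int p q / sqrt(int p^2 * int q^2) ).

    Symmetric in p and q, nonnegative, and zero iff p == q.
    """
    cross = gauss_prod_integral(m1, s1, m2, s2)
    norm = np.sqrt(gauss_prod_integral(m1, s1, m1, s1)
                   * gauss_prod_integral(m2, s2, m2, s2))
    return -np.log(cross / norm)

d_same = cauchy_schwarz_div(0.0, 1.0, 0.0, 1.0)   # 0: identical Gaussians
d_shift = cauchy_schwarz_div(0.0, 1.0, 2.0, 1.0)  # positive for a mean shift
```

For equal unit variances the expression simplifies to (m1 - m2)^2 / 4, so a mean shift of 2 gives a divergence of exactly 1.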
“…Intuitively, the symmetric Hölder centroid with α and γ close to one has a smaller variance (see Figure 4); therefore, it can better capture the clustering structure. This hints that one should consider the general Hölder divergence to replace CS in similar clustering applications [22,38]. Although one faces the problem of tuning the parameters α and γ, Hölder divergences can potentially give better results.…”
Section: Clustering Based On Symmetric Hölder Divergences
Mentioning (confidence: 99%)
“…In this paper, registration is mainly based on information theoretic shape matching [12], but with major differences. The affine and non-rigid transformations are performed in consecutive but separate steps, with each step employing a different cost function.…”
Section: Affine and Non-rigid Transformation
Mentioning (confidence: 99%)
“…Unlike MCC, MSE as a cost function is incapable of handling imperfect correspondence. Meanwhile, even with MCC, registration suffers from the inadequate correspondence found by the surprise metric [12]. Rigid CPD is heavily dependent on the initial position, which is undesirable.…”
Section: Point Set Registration
Mentioning (confidence: 99%)
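The robustness of MCC over MSE claimed in this snippet can be illustrated on a toy 1-D shift-estimation problem with a few corrupted correspondences. This is a sketch under assumed parameters, not the cited method: a brute-force grid search stands in for the fixed-point or gradient updates used in practice.

```python
import numpy as np

rng = np.random.default_rng(1)
a = rng.uniform(-5.0, 5.0, size=200)
t_true = 2.0
b = a + t_true + 0.05 * rng.normal(size=200)
b[:10] += 40.0                 # 5% of correspondences grossly wrong

# MSE estimate of the shift is the sample mean of the differences,
# which the wrong correspondences drag far from the true value.
t_mse = np.mean(b - a)

# MCC estimate: maximize the mean Gaussian kernel of the residuals.
# Outlier residuals fall outside the kernel support, so they are
# effectively ignored.
sigma = 0.5
cands = np.linspace(-10.0, 10.0, 4001)
scores = [np.mean(np.exp(-(b - a - t)**2 / (2.0 * sigma**2)))
          for t in cands]
t_mcc = cands[int(np.argmax(scores))]
```

With these parameters the MSE estimate is pulled roughly two units off the true shift, while the MCC estimate stays within the grid resolution of it.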