2002
DOI: 10.1109/msp.2002.1028355
Applications of entropic spanning graphs

Abstract: This paper presents applications of entropic spanning graphs to imaging and feature clustering. Entropic spanning graphs span a set of feature vectors in such a way that the normalized spanning length of the graph converges to the entropy of the feature distribution as the number of random feature vectors increases. This property makes these graphs naturally suited to applications where entropy and information divergence are used as discriminants, including texture classification and feature…
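The abstract's central property can be sketched numerically: the power-weighted length of a minimum spanning tree (MST) over n i.i.d. feature vectors yields a Rényi entropy estimate. This is a minimal illustration, not the authors' implementation; the function name is mine, and the known additive bias constant of the estimator is dropped, so the value is correct only up to a constant offset.

```python
# Sketch of an MST-based Rényi entropy estimate: the gamma-weighted MST
# length over n samples in R^d, with gamma = d*(1 - alpha), gives an
# estimate of the order-alpha Rényi entropy up to an additive constant
# (the bias constant is omitted here for simplicity).
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import pdist, squareform

def mst_renyi_entropy(X, alpha=0.5):
    """Rényi entropy estimate (up to an additive constant) of order alpha."""
    n, d = X.shape
    gamma = d * (1.0 - alpha)
    # Pairwise Euclidean distances raised to the power gamma.
    W = squareform(pdist(X)) ** gamma
    L = minimum_spanning_tree(W).sum()  # total weighted MST length
    return np.log(L / n**alpha) / (1.0 - alpha)

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
print(mst_renyi_entropy(X, alpha=0.5))
```

Scaling the data up increases the estimate, as expected for a differential entropy: doubling X doubles every MST edge and so raises the estimate by 2 log 2 when alpha = 0.5 in two dimensions.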

Cited by 197 publications (194 citation statements); references 50 publications.
“…When λ tends to 1, H_λ converges (by l'Hôpital's rule) to the Shannon entropy, which is therefore denoted by continuity H_1 = H. The Rényi entropy is widely used, not only in physics (e.g. statistical mechanics, physics of turbulence, cosmology; see [10,11,12] and references therein), but in various other areas such as signal processing (time-scale analysis, decision problems, machine learning; see [4,13,14,15] and references therein) or image processing (image matching, image registration; see [16,17] and references therein). …”
Section: The Rényi Entropy Uncertainty Relation
confidence: 99%
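The limit quoted in this excerpt is easy to verify numerically for a discrete distribution: H_λ = log(Σ p_i^λ)/(1 − λ) approaches the Shannon entropy −Σ p_i log p_i as λ → 1. A small check (function names mine):

```python
# Numerical check that the Rényi entropy H_lambda tends to the Shannon
# entropy H as lambda -> 1 (the l'Hôpital limit cited above).
import numpy as np

def renyi(p, lam):
    # Order-lambda Rényi entropy of a discrete distribution p.
    return np.log(np.sum(p ** lam)) / (1.0 - lam)

def shannon(p):
    return -np.sum(p * np.log(p))

p = np.array([0.5, 0.25, 0.125, 0.125])
for lam in (0.9, 0.99, 0.999):
    print(lam, renyi(p, lam))
print("Shannon:", shannon(p))
```

The printed values approach the Shannon entropy as λ approaches 1, and decrease with λ, consistent with the Rényi entropy being nonincreasing in its order.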
“…(16) is straightforward from (15), (7) and (11), while (17) is a consequence of (12) and [28, 9-28 (22)] (or [27, 8.5 (20)], or [29, 6.565-4]). Finally, plugging (16) and (17) into (8) and using [29, 8.380-3] together with q = p/(p−1), the sum of the entropy rates (18) for p = 2 follows. In the Shannon case, starting from ∫ (1 + r²)^(−(n+m)/2) dr as a beta integral and using [29, 6.576-4] to evaluate ∫₀^∞ r^(−λ) K_ν²(r) dr, we notice that h(r)^λ log(h(r)) = ∂/∂λ h(r)^λ (with h(r) = r and h(r) = 1 + r²) to finally obtain (19).…”
Section: The General Student-t Case
confidence: 99%
“…Note that in such a minimum spanning tree, the mixed-dataset samples that share edges only with other mixed-dataset samples are the ones that reduce a graph-theoretic estimate of the Henze-Penrose affinity between the datasets [12,16,13]. Note also that this strategy produces only a single false-alarm rate and a single detection rate, since the detection rule cannot be varied by changing a threshold as in the previous two cases; a receiver operating characteristic curve can nevertheless be constructed by joining the paired false-alarm and detection rates to the origin on one side and to the unit false-alarm and detection rates on the other.…”
Section: Detection Performance On Synthetic Data
confidence: 99%
“…The α-Jensen index function has been independently proposed by Ma [3] and He et al. [6] for image registration problems. Let f₀ and f₁ be two densities and β ∈ [0, 1] be a mixture parameter.…”
Section: α-Jensen Difference Function
confidence: 99%
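The α-Jensen difference mentioned here is the Rényi entropy of a mixture minus the mixture of the Rényi entropies; it vanishes when the two densities coincide and grows as they separate, which is what makes it usable as a registration criterion. A discrete-distribution sketch under that definition (names mine):

```python
# Discrete sketch of the alpha-Jensen difference:
#   J = H_alpha(beta*p + (1-beta)*q) - [beta*H_alpha(p) + (1-beta)*H_alpha(q)]
# Nonnegative for 0 < alpha < 1 because H_alpha is concave there.
import numpy as np

def renyi(p, alpha):
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def alpha_jensen(p, q, alpha=0.5, beta=0.5):
    mix = beta * p + (1.0 - beta) * q
    return renyi(mix, alpha) - (beta * renyi(p, alpha)
                                + (1.0 - beta) * renyi(q, alpha))

p = np.array([0.7, 0.2, 0.1])
q = np.array([0.1, 0.2, 0.7])
print(alpha_jensen(p, p))  # identical densities: zero
print(alpha_jensen(p, q))  # separated densities: positive
```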
“…The overall length of the MST can be used to construct a strongly consistent estimator of the Rényi entropy of Lebesgue-continuous densities [3].…”
Section: Minimum Spanning Tree and Rényi Entropy
confidence: 99%