2010
DOI: 10.2478/v10175-010-0019-1
Information geometry of divergence functions

Abstract: Measures of divergence between two points play a key role in many engineering problems. One such measure is a distance function, but there are many important measures which do not satisfy the properties of the distance. The Bregman divergence, Kullback-Leibler divergence and f-divergence are such measures. In the present article, we study the differential-geometrical structure of a manifold induced by a divergence function. It consists of a Riemannian metric, and a pair of dually coupled affine connections […]
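The divergence measures named in the abstract can be illustrated concretely. The sketch below (an assumption-laden illustration, not code from the paper) computes the Kullback-Leibler divergence between discrete distributions and a generic Bregman divergence; with the generator G(x) = ||x||²/2 the Bregman divergence reduces to half the squared Euclidean distance:

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) between discrete distributions."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(p * np.log(p / q)))

def bregman_divergence(G, grad_G, x, y):
    """Bregman divergence B_G(x, y) = G(x) - G(y) - <grad G(y), x - y>."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    return float(G(x) - G(y) - np.dot(grad_G(y), x - y))

# With the generator G(x) = ||x||^2 / 2, the Bregman divergence
# specialises to half the squared Euclidean distance.
G = lambda x: 0.5 * np.dot(x, x)
grad_G = lambda x: x
```

Note that neither measure is symmetric in general, which is exactly why they fail the axioms of a distance.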

Cited by 107 publications (155 citation statements)
References 25 publications
“…An insight using information geometry may further elucidate the fundamental structures of such divergences and geometry [1,3,7].…”
Section: Conclusion and Discussion
confidence: 99%
“…The divergences are closely related to the invariant geometrical properties of the manifold of probability distributions [5][6][7].…”
Section: D(p || z) ≤ D(p || q) + D(q || z) (Subadditivity/Triangle Inequality)
confidence: 99%
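The section header above asks whether a divergence satisfies the triangle inequality. A quick numerical check (a sketch using the KL divergence on hand-picked binary distributions, not an example from the cited work) shows that it can fail:

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence between discrete distributions."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(p * np.log(p / q)))

# Three binary distributions chosen so that going "directly" from p to z
# costs more than passing through the intermediate point q.
p = [0.5, 0.5]
q = [0.9, 0.1]
z = [0.99, 0.01]

direct = kl(p, z)               # D(p || z)  ~ 1.61
via_q = kl(p, q) + kl(q, z)     # D(p || q) + D(q || z)  ~ 0.66
print(direct > via_q)           # prints True: the triangle inequality fails
```

This is one concrete reason divergences induce a richer (dually affine) geometry than an ordinary metric would.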
“…The proof of equivalence uses concepts in convex analysis combined with connections between Bregman divergences and Riemannian manifolds developed in [2]. Using the equivalence of the two algorithms, we can exploit the desirable properties of both algorithms.…”
Section: Our Contribution
confidence: 99%
“…In particular, let G : Θ → ℝ denote a strictly convex twice-differentiable function; the divergence introduced by [7], B_G : Θ × Θ → ℝ₊, is B_G(θ, θ′) = G(θ) − G(θ′) − ⟨∇G(θ′), θ − θ′⟩. Bregman divergences are widely used in statistical inference, optimization, machine learning, and information geometry (see e.g. [2,5]). Letting Ψ(·, ·) = B_G(·, ·), the mirror descent step is defined as θ_{t+1} = argmin_{θ∈Θ} { η ⟨g_t, θ⟩ + Ψ(θ, θ_t) }, where g_t is the gradient at θ_t.…”
Section: Mirror Descent With Bregman Divergences
confidence: 99%
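The mirror descent step described in the excerpt above can be sketched for one classical special case: with the negative-entropy mirror map on the probability simplex, the induced Bregman divergence is the KL divergence and the update becomes the exponentiated-gradient rule. This is a minimal illustration under those assumptions, not the algorithm of the cited paper:

```python
import numpy as np

def mirror_descent_simplex(grad, x0, eta=0.1, steps=200):
    """Mirror descent on the probability simplex with the negative-entropy
    mirror map G(x) = sum_i x_i log x_i. The induced Bregman divergence is
    the KL divergence, and the resulting update is the exponentiated-gradient
    step x_{t+1} proportional to x_t * exp(-eta * grad f(x_t))."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x * np.exp(-eta * grad(x))
        x = x / x.sum()  # re-normalise back onto the simplex
    return x

# Minimise the linear objective f(x) = <c, x> over the simplex:
# the optimum puts all mass on the coordinate with the smallest cost.
c = np.array([3.0, 1.0, 2.0])
x_star = mirror_descent_simplex(lambda x: c, np.ones(3) / 3)
```

The multiplicative form of the update is exactly where the choice of Bregman divergence Ψ enters: a different generator G would yield a different (e.g. Euclidean, projected-gradient) step.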
“…But J. Burbea and C. R. Rao have proved in [18, Theorem 2] that the Bergman metric and the Fisher information metric do coincide for some probability density functions of particular forms. A similar potential function was used by S. Amari in [2] to derive the Riemannian metric of multivariate Gaussian distributions by means of divergence functions. We refer to [29] for a fuller account of the geometry of Hessian structures.…”
Section: Riemannian Geometry of Toeplitz Covariance Matrices
confidence: 99%
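The construction referenced in the excerpt above, recovering a Riemannian metric from a divergence function, can be written out in one line. The following is a standard sketch (the sign convention and the Gaussian example are standard results, not formulas quoted from the cited works):

```latex
% Riemannian metric induced by a divergence D at the point \theta:
g_{ij}(\theta)
  = -\left.
      \frac{\partial^2}{\partial \theta_i \, \partial \theta'_j}
      D(\theta \,\|\, \theta')
    \right|_{\theta' = \theta}
% For the Kullback-Leibler divergence this recovers the Fisher
% information metric; e.g. for a univariate Gaussian N(\mu, \sigma^2):
ds^2 = \frac{d\mu^2}{\sigma^2} + \frac{2\, d\sigma^2}{\sigma^2}
```

The dually coupled affine connections mentioned in the abstract arise from the third derivatives of the same divergence, which is why a single potential function suffices to generate the whole geometric structure.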