1986
DOI: 10.1090/s0002-9939-1986-0848890-5

An extended Čencov characterization of the information metric

Abstract: Čencov has shown that Riemannian metrics derived from the Fisher information matrix are the only metrics that preserve inner products under certain probabilistically important mappings. In Čencov's theorem, the underlying differentiable manifold is the probability simplex Σᵢ xᵢ = 1, xᵢ > 0. For some purposes of using geometry to obtain insights about probability, it is more convenient to regard the simplex as a hypersurface in the positive cone. In the present paper Čencov's result is extend…


Cited by 67 publications (94 citation statements). References 4 publications.
“…Our choice was driven by the unique invariance properties (e.g. reparametrization invariance) of the Fisher information matrix and the Fisher kernel [11,36,55]. Applying Cholesky decomposition, the kernel can be defined as a simple scalar product, as…”
Section: Visual Novelty via Fisher Information
confidence: 99%
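The statement above notes that, after a Cholesky decomposition of the Fisher information matrix, the Fisher kernel reduces to a plain scalar product. A minimal numerical sketch of that identity, with a toy positive-definite matrix standing in for the Fisher information and random vectors standing in for score (gradient) vectors — all names and values here are illustrative assumptions, not the cited papers' setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a Fisher information matrix: symmetric positive definite.
A = rng.standard_normal((3, 3))
F = A @ A.T + 3 * np.eye(3)

# Toy stand-ins for score vectors of two samples x and y.
g_x = rng.standard_normal(3)
g_y = rng.standard_normal(3)

# Fisher kernel: K(x, y) = g_x^T F^{-1} g_y
K_direct = g_x @ np.linalg.solve(F, g_y)

# Cholesky route: F = L L^T, so F^{-1} = L^{-T} L^{-1}; with the
# embedding phi(v) = L^{-1} v the kernel is the dot product <phi(x), phi(y)>.
L = np.linalg.cholesky(F)
phi_x = np.linalg.solve(L, g_x)
phi_y = np.linalg.solve(L, g_y)
K_cholesky = phi_x @ phi_y

assert np.isclose(K_direct, K_cholesky)
```

The point of the factorization is that `phi` gives an explicit finite-dimensional feature map, so the kernel can be fed to any algorithm that expects ordinary inner products.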
“…In the classical setting, it is known that except for an overall multiplicative constant the classical Fisher information metric is unique [22,23]: it is the only monotone Riemannian metric with the property of having its line element reduced under Markov morphisms (stochastic maps). Said otherwise, there is essentially one classical statistical distance quantifying the classical distinguishability between two probability distributions.…”
Section: On Information Geometry and Statistical Distinguishability
confidence: 99%
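The uniqueness property quoted above — that the Fisher metric's line element is reduced under Markov morphisms — can be checked numerically on the simplex, where the metric takes the form g_p(u, u) = Σᵢ uᵢ²/pᵢ. The sketch below, with an arbitrary random distribution, tangent vector, and column-stochastic matrix (all chosen here purely for illustration), verifies the contraction for one instance:

```python
import numpy as np

rng = np.random.default_rng(1)

def fisher_quadratic(p, u):
    # Fisher information metric on the simplex: g_p(u, u) = sum_i u_i^2 / p_i
    return np.sum(u**2 / p)

# An interior point of the probability simplex in R^4.
p = rng.random(4) + 0.1
p /= p.sum()

# A tangent vector to the simplex: components sum to zero.
u = rng.standard_normal(4)
u -= u.mean()

# A column-stochastic matrix T, i.e. a Markov morphism p -> T p.
T = rng.random((3, 4))
T /= T.sum(axis=0, keepdims=True)

before = fisher_quadratic(p, u)
after = fisher_quadratic(T @ p, T @ u)

# Monotonicity: the squared line element never increases under a stochastic map.
assert after <= before + 1e-12
```

Čencov's theorem says far more than this single check: up to an overall constant, the Fisher metric is the *only* Riemannian metric with this monotonicity under every Markov morphism.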
“…In the classical setting, it is known that except for an overall multiplicative constant the classical Fisher information metric is unique [6]: it is the only monotone Riemannian metric with the property of having its line element reduced under Markov morphisms (stochastic maps). In other words, there is essentially one classical statistical distance quantifying the classical distinguishability between two probability distributions.…”
Section: The Wigner-Yanase Quantum Information
confidence: 99%