2015
DOI: 10.3390/e17074918

Fisher Information Properties

Abstract: A set of Fisher information properties is presented in order to draw a parallel with similar properties of Shannon differential entropy. Already known properties are presented together with new ones, which include: (i) a generalization of mutual information for Fisher information; (ii) a new proof that Fisher information increases under conditioning; (iii) a result showing that Fisher information decreases in Markov chains; and (iv) a bound on the estimation error using Fisher information. This last result is especially important…
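
Property (iv) parallels the classical Cramér–Rao argument. As a hedged sketch of how Fisher information bounds estimation error in the standard scalar-parameter setting (not necessarily the exact form derived in the paper):

$$
I(\theta) \;=\; \mathbb{E}\!\left[\left(\frac{\partial}{\partial\theta}\,\log f(X;\theta)\right)^{\!2}\right],
\qquad
\operatorname{Var}\bigl(\hat{\theta}\bigr) \;\ge\; \frac{1}{n\,I(\theta)},
$$

for any unbiased estimator $\hat{\theta}$ computed from $n$ i.i.d. samples of $f(\cdot\,;\theta)$, under the usual regularity conditions; larger Fisher information thus forces a smaller achievable estimation error.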

Cited by 45 publications (36 citation statements) | References 45 publications

Citation statements (ordered by relevance):
“…A similar inequality is derived in [24] for the mutual Fisher information. The mutual Fisher information is more similar to $V[E[u_\theta \,|\, e]]$ than to $GI_\theta$.…”
Section: Monotonicity of Asymptotic Efficiency
confidence: 71%
“…The mutual Fisher information is more similar to $V[E[u_\theta \,|\, e]]$ than to $GI_\theta$. Theorem 13 of [24] corresponds to the one-dimensional version of the inequality…”
Section: Monotonicity of Asymptotic Efficiency
confidence: 99%
“…So by replacing $W$ and $U_A$ in (45), or equivalently $P_{\mathrm{out}}(y\,|\,z) = (1-\epsilon)\,\delta(y-\pi(z)) + \epsilon\,\delta(y+\pi(z))$, into (44), one obtains the Shannon capacity of the BSC, $R^{\mathrm{pot}}_{\infty} = 1 - h_2(\epsilon)$, where $h_2$ is the binary entropy function. Using (43), this map also gives the algorithmic threshold…”
Section: B. Binary Symmetric Channel
confidence: 99%
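
For reference, the binary entropy function and the BSC capacity quoted in the excerpt above are the standard expressions below; $\epsilon$ denotes the crossover probability, and the quotation writes the capacity as $R^{\mathrm{pot}}_{\infty}$:

$$
h_2(\epsilon) \;=\; -\,\epsilon \log_2 \epsilon \;-\; (1-\epsilon)\log_2(1-\epsilon),
\qquad
C_{\mathrm{BSC}} \;=\; 1 - h_2(\epsilon).
$$

For example, $\epsilon \approx 0.11$ gives $h_2(\epsilon) \approx 0.5$, i.e. a capacity of roughly half a bit per channel use.
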
“…It is important to note that the term $d_{J(I)}\!\left(f_{\{S\}_n};\, f_{\{L\}_n;\psi,\theta}\right)_{\theta_k}$ corresponds to the relative information [21]. This term can be re-expressed as:…”
Section: The Kullback–Leibler Divergence Is Useful
confidence: 99%