1998
DOI: 10.1162/089976698300017115

Mutual Information, Fisher Information, and Population Coding

Abstract: In the context of parameter estimation and model selection, it is only quite recently that a direct link between the Fisher information and information-theoretic quantities has been exhibited. We give an interpretation of this link within the standard framework of information theory. We show that in the context of population coding, the mutual information between the activity of a large array of neurons and a stimulus to which the neurons are tuned is naturally related to the Fisher information. In the light o…
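The relation the abstract refers to is usually quoted, for a large population in the small-noise regime, as I(θ; r) ≈ H(θ) + (1/2)⟨ln[J(θ)/(2πe)]⟩, where J(θ) is the population Fisher information. The sketch below evaluates this Fisher-information approximation for an assumed toy model that is not specified in the paper itself: independent Poisson neurons with Gaussian tuning curves and a uniform stimulus; all parameter values and function names are illustrative.

```python
import numpy as np

# Toy population (assumed, not from the paper): N independent Poisson neurons
# with Gaussian tuning curves whose preferred stimuli tile [-pi, pi).
N = 100                       # number of neurons (illustrative)
theta_pref = np.linspace(-np.pi, np.pi, N, endpoint=False)
r_max, sigma = 20.0, 0.5      # peak rate and tuning width (illustrative)

def rates(theta):
    """Mean firing rates f_i(theta) of all N neurons."""
    return r_max * np.exp(-0.5 * ((theta - theta_pref) / sigma) ** 2)

def rate_derivs(theta):
    """Derivatives f_i'(theta) of the tuning curves."""
    return rates(theta) * (-(theta - theta_pref) / sigma ** 2)

def fisher_info(theta):
    """Population Fisher information for independent Poisson neurons:
    J(theta) = sum_i f_i'(theta)^2 / f_i(theta)."""
    f, fp = rates(theta), rate_derivs(theta)
    return np.sum(fp ** 2 / f)

# Fisher-information approximation to the mutual information for a
# uniform stimulus on [-pi, pi):
#   I_Fisher = H(theta) + (1/2) * < ln( J(theta) / (2*pi*e) ) >
thetas = np.linspace(-np.pi, np.pi, 1000, endpoint=False)
J = np.array([fisher_info(t) for t in thetas])
H_theta = np.log(2 * np.pi)   # entropy of the uniform prior (nats)
I_fisher = H_theta + 0.5 * np.mean(np.log(J / (2 * np.pi * np.e)))

print(f"mean population Fisher information: {J.mean():.1f}")
print(f"Fisher approximation to the MI:     {I_fisher:.2f} nats")
```

Increasing N or the peak rate increases J(θ) and hence the approximate mutual information, which is the qualitative link between the two quantities that the abstract alludes to.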

Cited by 304 publications (367 citation statements)
References 20 publications
“…The asymptotics can be found analytically by expanding both bounds (7) and (9) for q → 1. For a smooth potential V both bounds give asymptotically the same logarithmic growth $R^{\mathrm{Bayes}}_m/N \simeq \tfrac{1}{2}\ln\alpha$, which can also be obtained by well-known asymptotic expansions involving the Fisher information matrix [22,23,11]. On the other hand, our bounds can also be used when these standard asymptotic expansions do not apply, e.g.…”
supporting
confidence: 68%
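For context, the "well-known asymptotic expansions involving the Fisher information matrix" mentioned in this excerpt are typically quoted in the following form (a sketch of the standard result, e.g. Clarke and Barron, for N i.i.d. observations of a smooth d-dimensional parametric family; none of this is taken from the excerpted paper):

```latex
% Asymptotic expansion of the mutual information between a d-dimensional
% parameter \theta with smooth prior p(\theta) and N i.i.d. observations,
% under standard regularity conditions:
I(\theta; x_1,\dots,x_N)
  \;=\; \frac{d}{2}\ln\frac{N}{2\pi e}
  \;+\; H[p(\theta)]
  \;+\; \frac{1}{2}\int \! d\theta\, p(\theta)\,\ln\det J(\theta)
  \;+\; o(1)
```

Here J(θ) is the per-observation Fisher information matrix; the leading (d/2) ln N term is the logarithmic growth referred to in the excerpt.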
“…We combine information theoretic bounds for the performance of statistical estimators (see e.g. [10,11,12]) with an elementary variational principle of statistical physics [13]. This will allow us to compute rigorous upper and lower bounds for the critical number of examples at which a transition occurs.…”
mentioning
confidence: 99%
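The excerpt does not say which "elementary variational principle of statistical physics" is meant; one common candidate is the Gibbs-Bogoliubov inequality, reproduced below purely as background under that assumption:

```latex
% Gibbs-Bogoliubov inequality (assumed candidate, not confirmed by the excerpt):
% for a Hamiltonian H and any trial Hamiltonian H_0 with free energy F_0 and
% Gibbs average <.>_0 at the same temperature, the true free energy F satisfies
F \;\le\; F_0 \;+\; \big\langle H - H_0 \big\rangle_0 ,
% which yields rigorous upper bounds on F by optimizing over H_0.
```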
“…The already mentioned low-noise approximation to MI is constructed by employing the Cramér-Rao bound [12,21-23]. Although we demonstrate that the high-noise approximation also involves FI, we never employ the Cramér-Rao bound and the appearance of FI is due to certain asymptotic properties of the KL distance [28].…”
Section: Measures Of Information
mentioning
confidence: 78%
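The "asymptotic properties of the KL distance" that make the Fisher information appear are the standard local behaviour of the Kullback-Leibler divergence in a smooth parametric family, recalled here as background (not taken from the cited paper):

```latex
% Local quadratic expansion of the Kullback-Leibler divergence, with the
% Fisher information matrix J(\theta) as its curvature:
D_{\mathrm{KL}}\!\left(p_\theta \,\middle\|\, p_{\theta+\delta\theta}\right)
  \;=\; \tfrac{1}{2}\,\delta\theta^{\top} J(\theta)\,\delta\theta
  \;+\; O\!\left(\lVert\delta\theta\rVert^{3}\right).
```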
“…(26), (2) compute the derivative of the log likelihood, Eq. (28), and (3) show that the maximum likelihood estimator is unbiased and efficient (i.e., it reaches the Cramér-Rao bound).…”
Section: Appendix B Correlated Neurons With Firing Rate Proportional
mentioning
confidence: 96%
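The excerpt points to equations (26) and (28) of the citing paper, which are not reproduced here. As a generic stand-in for the final step, the sketch below checks numerically, for the textbook case of a Gaussian likelihood with known variance, that the maximum-likelihood estimator of the mean is unbiased and that its variance attains the Cramér-Rao bound 1/J = σ²/n; all names and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
mu_true, sigma, n, trials = 1.5, 2.0, 50, 20000   # illustrative values

# For a Gaussian with known variance, the ML estimator of the mean is the
# sample mean of the n observations.
samples = rng.normal(mu_true, sigma, size=(trials, n))
mu_hat = samples.mean(axis=1)

fisher = n / sigma**2        # Fisher information J of n i.i.d. observations
cramer_rao = 1.0 / fisher    # lower bound on the variance of any unbiased estimator

print(f"bias of the ML estimator:     {mu_hat.mean() - mu_true:+.4f}  (~0 if unbiased)")
print(f"variance of the ML estimator: {mu_hat.var():.4f}")
print(f"Cramer-Rao bound 1/J:         {cramer_rao:.4f}  (matched => efficient)")
```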
“…We use the log of the covariance matrix to assess the quality of the estimator because it determines, to a large extent, the mutual information between the noisy neuronal responses, a, and the stimulus, s: the smaller the determinant of the covariance matrix, the larger the mutual information [3]. (Strictly speaking, this result applies only when the neurons are uncorrelated.…”
Section: Constructing Networks That Perform Optimal Estimation
mentioning
confidence: 99%
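The determinant argument in this last excerpt has a simple closed form when the posterior over the stimulus is approximately Gaussian; the identity below is standard and is given only as background for why a smaller determinant means a larger mutual information:

```latex
% If p(s|a) is approximately Gaussian with covariance \Sigma (dimension d),
% its differential entropy is \tfrac{1}{2}\ln[(2\pi e)^d \det\Sigma], so
I(a;s) \;=\; H(s) - H(s\,|\,a)
       \;\approx\; H(s) \;-\; \tfrac{1}{2}\,
         \Big\langle \ln\!\big[(2\pi e)^{d}\det\Sigma\big] \Big\rangle ,
% which decreases in \ln\det\Sigma: the smaller the determinant, the larger the MI.
```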