2020 IEEE International Symposium on Information Theory (ISIT)
DOI: 10.1109/isit44484.2020.9174450
On Nonparametric Estimation of the Fisher Information

Cited by 4 publications (11 citation statements)
References 20 publications
“…We take the kernel to be $k(y) = \varphi(y) = \frac{1}{\sqrt{2\pi}} e^{-y^{2}/2}$; and 2) estimate $f_Y^{(k)}(y)$ by taking the derivative of (209) $k$ times. The above estimators are inspired by the estimators of the score function and the Fisher information studied in [44], [45] and [46].…”
Section: A. Empirical Bayes for Higher-Order Conditional Moments
confidence: 99%
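The construction quoted above (a Gaussian-kernel density estimate, differentiated $k$ times) can be made concrete. Below is a minimal NumPy sketch under that reading; the helper name kde_density_deriv, the bandwidth h, and the toy data are illustrative assumptions, not the estimator of [44]–[46] or the code of [22]. It exploits the identity $\varphi^{(k)}(u) = (-1)^k \mathrm{He}_k(u)\,\varphi(u)$, where $\mathrm{He}_k$ is the probabilist's Hermite polynomial, so no symbolic differentiation is needed.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermeval

def kde_density_deriv(y, samples, h, k=0):
    """Kernel estimate of the k-th derivative of f_Y at the points y.

    Hypothetical sketch: differentiates the Gaussian-kernel KDE k times
    via phi^(k)(u) = (-1)^k He_k(u) phi(u).
    """
    u = (np.atleast_1d(y)[:, None] - samples[None, :]) / h
    phi = np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)
    coeffs = np.zeros(k + 1)
    coeffs[k] = 1.0                      # selects He_k in hermeval
    return ((-1) ** k * hermeval(u, coeffs) * phi).mean(axis=1) / h ** (k + 1)

# Toy check: Y = X + standard Gaussian noise; score estimate f'(y)/f(y).
rng = np.random.default_rng(0)
Y = rng.standard_normal(5000) + rng.standard_normal(5000)
pts = np.array([-1.0, 0.0, 1.0])
score = kde_density_deriv(pts, Y, h=0.3, k=1) / kde_density_deriv(pts, Y, h=0.3, k=0)
```

Dividing the k = 1 estimate by the k = 0 estimate, as in the last line, yields a plug-in estimate of the score function, whose squared expectation is the Fisher information.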
“…Finally, we demonstrate the difference in the bounds on the convergence rates between Bhattacharya’s estimator and its clipped version by comparing the sample complexity of the two estimators, that is, the required number of samples to guarantee a given accuracy with a given confidence. MATLAB implementations of both estimators, as well as the code used to generate the figures below, can be found in [22].…”
Section: Estimation of the Fisher Information of a Random Variable in Gaussian Noise
confidence: 99%
“…This trivially implies that X is sub-Gaussian. In order to make the comparison as fair as possible, the parameters of the kernel estimators are not chosen according to Theorem 4 or Theorem 5, but are calculated by numerically minimizing the required number of samples; see [22] for details.…”
Section: Estimation of the Fisher Information of a Random Variable in Gaussian Noise
confidence: 99%
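The tuning step described in this excerpt, choosing kernel parameters by numerically minimizing the required number of samples rather than by the theorems, amounts to a minimization over a parameter grid. A hedged sketch follows; tune_kernel_params and toy_bound are hypothetical stand-ins, since the actual sample-complexity expression is given only in [22].

```python
import itertools
import numpy as np

def tune_kernel_params(n_required, grids):
    """Return the grid point minimizing a sample-complexity bound.

    n_required: callable taking one value per parameter and returning the
        number of samples the bound demands (real expression is in [22]).
    grids: one 1-D array of candidate values per kernel parameter.
    """
    return min(itertools.product(*grids), key=lambda p: n_required(*p))

# Toy stand-in with a generic bias/variance shape, NOT the bound from the
# paper: small h inflates the variance term, large h inflates the bias term.
toy_bound = lambda h: 1.0 / h**2 + 100.0 * h**2
h_star, = tune_kernel_params(toy_bound, [np.linspace(0.05, 2.0, 200)])
```

Any number of kernel parameters can be tuned this way by passing one grid per parameter, at the usual exponential cost of an exhaustive grid search.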
“…Estimators for these quantities can be found, e.g., in [3, 4, 8, 20]. The proposition above can be generalized to richer classes of functions K, e.g.…”
Section: Compensating for Gain
confidence: 99%