2020
DOI: 10.48550/arxiv.2010.06284
Preprint

Entropy-based test for generalized Gaussian distributions

Abstract: In this paper, we provide a proof of the L² consistency of the k-th nearest neighbour distance estimator of the Shannon entropy, for an arbitrary fixed k ≥ 1. Based on the maximum entropy principle, we construct a non-parametric goodness-of-fit test for a class of generalized multivariate Gaussian distributions introduced here. The theoretical results are followed by numerical studies on simulated samples.
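The abstract refers to the k-th nearest neighbour (Kozachenko–Leonenko) estimator of differential Shannon entropy, whose L² consistency the paper establishes for fixed k ≥ 1, and to a goodness-of-fit test built on the maximum entropy principle. For orientation only, below is a minimal Python sketch of the generic Kozachenko–Leonenko estimator; it is a textbook implementation under standard assumptions, not the authors' code, and the paper's actual test statistic for generalized Gaussian distributions is not reproduced. The function name kl_entropy and the parameter choices in the sanity check are illustrative.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def kl_entropy(x, k=1):
    """Kozachenko-Leonenko k-NN estimate of differential Shannon entropy.

    x : (n, d) array of i.i.d. samples from an unknown continuous density.
    k : fixed nearest-neighbour order (k >= 1), as in the paper's setting.
    """
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    # eps_i = distance from x_i to its k-th nearest neighbour (self excluded).
    eps = cKDTree(x).query(x, k=k + 1)[0][:, k]
    # Log volume of the unit d-ball: V_d = pi^(d/2) / Gamma(d/2 + 1).
    log_vd = (d / 2.0) * np.log(np.pi) - gammaln(d / 2.0 + 1.0)
    # H_hat = psi(n) - psi(k) + log V_d + (d/n) * sum_i log eps_i
    return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(eps))

# Sanity check against the closed-form Gaussian entropy: for a standard
# normal in d dimensions, H = (d/2) * log(2*pi*e).
rng = np.random.default_rng(0)
sample = rng.standard_normal((5000, 2))
print(kl_entropy(sample, k=3))  # should be close to log(2*pi*e) ~ 2.838
```

In an entropy-based goodness-of-fit test of this kind, the estimate is compared with the maximal entropy attainable in the hypothesized family under matching moment constraints; the precise calibration of the rejection region follows the paper and is not sketched here.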

Cited by 2 publications (3 citation statements)
References 19 publications

Citation statements:
“…We apply Theorem 4.5. Repeating the lines of the proof of (Cadirci et al., 2020, Theorem 3), we compute E[ξ(0, P_λ)], where…”
Section: Entropy Estimation
confidence: 99%
“…Lund & Jammalamadaka (2000) considered the entropy based test of goodness of fit for the von Mises distribution on the circle and use a different entropy estimate. Our study is motivated, particularly, by the work of Cadirci et al (2020), where the entropy based goodness of fit test for generalized Gaussian distribution is given.…”
Section: Introductionmentioning
confidence: 99%
“…The quadratic Rényi entropy was investigated by [18]. An entropy-based goodness-of-fit test for generalized Gaussian distributions is presented by [3]. A recent application to image processing can be found in [6].…”
Section: Introduction
confidence: 99%