2010
DOI: 10.1080/00949650903005656
A new estimator of entropy and its application in testing normality

Cited by 67 publications (8 citation statements)
References 13 publications
“…Now the entropy is a characteristic playing a fundamental role not only in information theory and communication, but also in classification, pattern recognition, statistical physics, stochastic dynamics, statistics, etc. Just in statistics Shannon's entropy is used as a descriptive parameter (measure of dispersion), for testing normality (Vasicek in [15], Arizono and Ohta in [1], Noughabi in [10]), exponentiality (Grzegorzewski and Wieczorkowski in [6], Taufer in [13]) and uniformity (Dudewicz, et al in [4]), etc. (see also [3], [8], [11], [16], [17]).…”
Section: Entropy
confidence: 99%
“…methods have been discussed as a non-parametric approach in Refs. [13,14]. This strategy is flexible and robust because it does not enforce a model or parametric constraints.…”
Section: Introduction
confidence: 99%
“…Shannon entropy is a crucial descriptive parameter in statistics, especially for evaluating data dispersion and performing tests for normality, exponentiality and uniformity [11,12]. Entropy estimation is challenging, especially when the model is unknown; in these cases, non-parametric methods, as those based on spacings [13,14], can be used. This strategy is flexible and robust because it does not enforce a model or parametric constraints.…”
Section: Introduction
confidence: 99%
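The citation statements above repeatedly reference spacing-based non-parametric entropy estimation, the approach underlying the cited paper. A minimal sketch of Vasicek's classical spacing estimator, the baseline that entropy-based normality tests build on, assuming the standard definition H_{m,n} = (1/n) Σ ln( n (x_(i+m) − x_(i−m)) / (2m) ) with order statistics clamped at the sample extremes; the function name is illustrative, not from the paper:

```python
import math


def vasicek_entropy(sample, m):
    """Vasicek (1976) spacing-based estimator of Shannon entropy.

    Sorts the sample and averages the log of scaled m-spacings
    between order statistics; indices beyond the range are clamped
    to the smallest/largest observation, as in the usual definition.
    """
    n = len(sample)
    xs = sorted(sample)
    total = 0.0
    for i in range(n):
        hi = xs[min(i + m, n - 1)]   # x_(i+m), clamped at the maximum
        lo = xs[max(i - m, 0)]       # x_(i-m), clamped at the minimum
        total += math.log(n * (hi - lo) / (2 * m))
    return total / n
```

For a normal sample the estimate should approach the true entropy ln√(2πeσ²) ≈ 1.4189 for σ = 1, which is the comparison exploited by the entropy-based normality tests cited above.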