1992
DOI: 10.1111/j.1467-842x.1992.tb01043.x

On the Fourth Root Transformation of Chi‐square

Abstract: We show that, within the family of power transformations of a Chi-square variable, the square and fourth roots minimize Pearson's index of kurtosis. Two new transformations of the fourth root, a symmetrized-truncated version and its linear combination with the square root, are also studied. The first transformation shows a considerable improvement over the fourth root, while the second one turns out to be even more accurate than Hilferty-Wilson's cube root transformation.
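As a quick illustration of the comparison the abstract describes, here is a minimal sketch (my own, not code from the paper). It assumes only the standard chi-square moment identity E[X^s] = 2^s Γ(k/2 + s) / Γ(k/2) for X ~ χ²_k, and computes the skewness and Pearson's kurtosis index β₂ = μ₄/μ₂² of the power transform Y = X^(1/r) for r = 2, 3, 4, so the square-root, cube-root (Hilferty-Wilson) and fourth-root transforms can be compared against the Gaussian reference values (skewness 0, β₂ = 3).

import numpy as np
from scipy.special import gammaln

def chi2_power_moment(k, s):
    # E[X^s] for X ~ chi-square with k degrees of freedom (valid for s > -k/2)
    return np.exp(s * np.log(2.0) + gammaln(k / 2.0 + s) - gammaln(k / 2.0))

def root_transform_shape(k, r):
    # skewness and Pearson's kurtosis index beta2 = mu4 / mu2^2 of Y = X**(1/r)
    m1, m2, m3, m4 = (chi2_power_moment(k, j / r) for j in range(1, 5))
    mu2 = m2 - m1**2
    mu3 = m3 - 3 * m1 * m2 + 2 * m1**3
    mu4 = m4 - 4 * m1 * m3 + 6 * m1**2 * m2 - 3 * m1**4
    return mu3 / mu2**1.5, mu4 / mu2**2

for r in (2, 3, 4):                      # square, cube and fourth roots
    skew, beta2 = root_transform_shape(k=10, r=r)
    print(f"r = {r}: skewness = {skew:+.4f}, beta2 = {beta2:.4f}  (Gaussian: 0 and 3)")

The value k = 10 is an arbitrary example choice of degrees of freedom; running the loop over several k shows how the skewness and β₂ of each root transform compare with the Gaussian values, which is the kind of comparison the abstract reports.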

Cited by 11 publications (21 citation statements)
References 10 publications
“…The motivation for this generalization lies in the statistical investigation done in [12], which analyzes the Kullback-Leibler (KL) divergence of a power-transformed chi-squared random variable from a Gaussian random variable with suitable mean and variance: the results of this analysis clearly show that the KL divergence is minimized for power exponents ranging from r = 3 to r = 4. A similar result is obtained by [13], which compares the cumulants of a power-transformed chi-squared random variable with the cumulants of a Gaussian random variable: if we match either the skewness or the kurtosis of the two random variables, we obtain integer power exponents 2 ≤ r ≤ 4. As a consequence, herein we propose to approximate a chi-squared random variable x, normalized by the number of DOF 2N, with the r-th power of a Gaussian random variable, as expressed by…”
Section: B. Power Transformation (supporting)
confidence: 75%
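As a hedged numerical companion to the quoted statement, here is a small sketch of my own (not code from [11], [12] or [13]). It assumes the matching Gaussian is obtained by exact moment matching and that plain numerical integration is adequate, and it evaluates the KL divergence between Y = (X/(2N))^(1/r), with X chi-squared on 2N degrees of freedom, and the Gaussian with the same mean and variance, for several exponents r, so the reported minimum between r = 3 and r = 4 can be checked directly.

import numpy as np
from scipy import stats
from scipy.integrate import quad
from scipy.special import gammaln

def transformed_pdf(y, N, r):
    # density of Y = (X / (2N))**(1/r) for X ~ chi-square(2N), by change of variables
    return stats.chi2.pdf(2.0 * N * y**r, df=2 * N) * 2.0 * N * r * y**(r - 1)

def matched_gaussian(N, r):
    # exact mean and variance of Y, from E[X^s] = 2^s Gamma(N + s) / Gamma(N)
    m1 = np.exp(gammaln(N + 1.0 / r) - gammaln(N) - np.log(N) / r)
    m2 = np.exp(gammaln(N + 2.0 / r) - gammaln(N) - 2.0 * np.log(N) / r)
    return m1, m2 - m1**2

def kl_to_gaussian(N, r):
    # KL( law of Y || Gaussian with matched mean and variance ), integrated numerically
    mean, var = matched_gaussian(N, r)
    sd = np.sqrt(var)
    def integrand(y):
        f = transformed_pdf(y, N, r)
        if f <= 0.0:
            return 0.0
        return f * (np.log(f) - stats.norm.logpdf(y, loc=mean, scale=sd))
    lo, hi = max(mean - 12.0 * sd, 1e-12), mean + 12.0 * sd   # tails beyond this are negligible
    return quad(integrand, lo, hi, limit=200)[0]

for r in (2, 3, 4, 5):
    print(f"r = {r}: KL divergence = {kl_to_gaussian(N=8, r=r):.2e}")

Here N = 8 and the exponent grid are illustrative choices only; the analysis attributed to [12] in the quotation sweeps the exponent and locates the minimum between r = 3 and r = 4.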
“…where m_r(N) and V_r(N) are the mean and the variance of the Gaussian variable, expressed respectively by [11], [13]…”
Section: B. Power Transformation (mentioning)
confidence: 99%
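The excerpt truncates the expressions for m_r(N) and V_r(N). One explicit possibility, stated here only as an assumption, is the exact moment-matching form that follows from E[X^s] = 2^s Γ(N + s) / Γ(N) for X ~ χ² with 2N degrees of freedom (the cited works [11], [13] may instead use asymptotic variants, e.g. the Hilferty-Wilson mean 1 - 1/(9N) and variance 1/(9N) for r = 3):

m_r(N) = Γ(N + 1/r) / (N^(1/r) Γ(N)),    V_r(N) = Γ(N + 2/r) / (N^(2/r) Γ(N)) - m_r(N)²

These are the same quantities computed by matched_gaussian in the sketch above.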