2022
DOI: 10.1017/9781009053730
The Fundamentals of Heavy Tails

Abstract: Heavy tails – extreme events or values more common than expected – emerge everywhere: the economy, natural events, and social and information networks are just a few examples. Yet after decades of progress, they are still treated as mysterious, surprising, and even controversial, primarily because the necessary mathematical models and statistical methods are not widely known. This book, for the first time, provides a rigorous introduction to heavy-tailed distributions accessible to anyone who knows elementary probability.

Cited by 29 publications (32 citation statements). References: 0 publications.
“…For the sum of dependent random variables, the tail heaviness of the aggregated random variable depends on the heaviness of the marginal tails and the dependence of the tails (Albrecher et al., 2006; Kortschak & Albrecher, 2009). For the additive aggregation of many random variables, the aggregation via the mean may remove heavy tails, as a consequence of a variation on the classical Central Limit Theorem (Billingsley, 1995; Nair et al., 2017). However, heavy tails are preserved in the limiting distribution of mean-aggregated processes if the condition of finite variance of the components is not met, resulting in an α-stable distribution (Table 1; for details, see Nair et al. [2017]).…”
Section: Arithmetic Combination Of Random Variables (S1) [mentioning]
confidence: 99%
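The claim quoted above is easy to check numerically. The sketch below is my own illustration, not code from the book or from Nair et al. (2017): it averages blocks of i.i.d. Pareto samples. With tail index 3 the variance is finite and the block means concentrate, as the Central Limit Theorem predicts; with tail index 1.5 the variance is infinite and the block means themselves stay heavy-tailed, consistent with an α-stable limit. The tail indices and block sizes are arbitrary illustrative choices.

```python
# Illustrative sketch: does averaging i.i.d. components remove heavy tails?
import numpy as np

rng = np.random.default_rng(0)
n_blocks, block_size = 10_000, 500

def block_means(tail_index):
    # rng.pareto draws Lomax samples; adding 1 gives a classical Pareto with
    # x_m = 1, whose variance is finite iff tail_index > 2.
    samples = 1.0 + rng.pareto(tail_index, size=(n_blocks, block_size))
    return samples.mean(axis=1)

for alpha in (3.0, 1.5):  # 3.0: finite variance, 1.5: infinite variance
    means = block_means(alpha)
    # Large ratios of an extreme quantile to the median indicate that heavy
    # tails survive the averaging.
    ratio = np.quantile(means, 0.999) / np.median(means)
    print(f"tail index {alpha}: 99.9% quantile / median of block means = {ratio:.2f}")
```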
“…These observations are consistent with the corresponding PDFs shown in figure 4. Namely, because BG noise is a mixture of two Gaussian distributions, BG noise has rapidly decaying 'light' tails, whereas SαS noise has slowly decaying 'heavy' tails with a higher probability of extreme values [71].…”
Section: Impulsive Noise [mentioning]
confidence: 99%
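The contrast described in this statement can be reproduced with a small simulation. The sketch below is my own illustration, assuming "BG" denotes a Bernoulli-Gaussian mixture and "SαS" symmetric α-stable noise; the mixture probability, variances, and α = 1.5 are arbitrary choices, and SciPy's levy_stable generator stands in for whatever noise model the cited paper used. At moderate thresholds the two processes look similarly impulsive, but far out in the tail the Gaussian mixture decays rapidly while the α-stable tail does not.

```python
# Illustrative sketch: light-tailed Bernoulli-Gaussian vs heavy-tailed SaS noise.
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(1)
n = 500_000

# Bernoulli-Gaussian mixture: a rare (p = 0.05) high-variance Gaussian impulse
# on top of background N(0, 1) noise.
impulse = rng.random(n) < 0.05
bg = np.where(impulse, rng.normal(0.0, 10.0, n), rng.normal(0.0, 1.0, n))

# Symmetric alpha-stable noise (beta = 0) with alpha = 1.5.
sas = levy_stable.rvs(1.5, 0.0, size=n, random_state=2)

for name, x in (("BG mixture", bg), ("SaS", sas)):
    iqr = np.subtract(*np.percentile(x, [75, 25]))  # robust common scale
    z = np.abs(x) / iqr
    # Both produce occasional moderate excursions, but only the alpha-stable
    # tail keeps non-negligible mass at very large thresholds.
    print(f"{name:>10}:  P(|X| > 10*IQR) ~ {np.mean(z > 10):.1e}   "
          f"P(|X| > 40*IQR) ~ {np.mean(z > 40):.1e}")
```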
“…Consequently, for the impulsive noise types, we also investigated replacing the feature min-max scaling of the target data with a quantile transformation [78, section 7.4.1] applied independently to each channel to make the data approximately follow a standard normal distribution. The motivations for this transformation were twofold: (1) it ensured that the distribution of each channel was unimodal with 'light' tails [71], and (2) it effectively limited the impact of outliers.…”
Section: Quantile Data Transformation For Impulsive Noise [mentioning]
confidence: 99%
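The per-channel quantile transformation described above can be sketched with scikit-learn's QuantileTransformer. The cited paper's actual pipeline, channel layout, and data are not reproduced here; the two Cauchy-distributed "channels" below are placeholders chosen only because they are very heavy-tailed. Mapping each column independently onto an approximately standard normal distribution yields unimodal, light-tailed marginals and bounds the influence of outliers.

```python
# Illustrative sketch: per-channel quantile transformation to a normal target.
import numpy as np
from sklearn.preprocessing import QuantileTransformer

rng = np.random.default_rng(0)
# Toy "impulsive" data: 2 channels of standard-Cauchy noise (very heavy tails).
X = rng.standard_cauchy(size=(10_000, 2))

qt = QuantileTransformer(output_distribution="normal", n_quantiles=1000)
X_gauss = qt.fit_transform(X)  # each channel is approximately N(0, 1) afterwards

# The raw extremes can be enormous; the transformed values are bounded.
print("raw max |x|:        ", float(np.abs(X).max()))
print("transformed max |x|:", float(np.abs(X_gauss).max()))
```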
“…), making it sensitive to non-linear relationships in data [21]. It is well-known that Gaussian models can fail when the underlying generative dynamic is strongly non-normal, such as in heavy-tailed distributions (for example, the distribution of wealth in the United States) [22, 23] or temporally extended human dynamics, such as epidemic spreading [24], and these nonlinearities can lead to erroneous conclusions and incorrect models [25]. Finally, for researchers who do wish to leverage the power of linear models, or work with continuous data, closed-form Gaussian estimators of all major information-theoretic relationships exist: for example, the Pearson correlation is a function of the mutual information between Gaussian variables [21], the partial correlation maps to the conditional mutual information [26], and the Granger causality is a special case of the more general transfer entropy [27].…”
Section: Introduction [mentioning]
confidence: 99%
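One of the Gaussian closed forms mentioned at the end of this statement has a particularly simple expression: for jointly Gaussian variables, the mutual information depends on the Pearson correlation ρ alone, I(X; Y) = -0.5 ln(1 - ρ²) nats. The snippet below (my own illustration, not code from the cited papers) evaluates this formula on simulated bivariate Gaussian data; the correlation value and sample size are arbitrary.

```python
# Illustrative sketch: closed-form Gaussian mutual information from correlation.
import numpy as np

rng = np.random.default_rng(0)
rho = 0.7
cov = [[1.0, rho], [rho, 1.0]]
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=500_000).T

r = np.corrcoef(x, y)[0, 1]                    # sample Pearson correlation
mi_closed_form = -0.5 * np.log(1.0 - r ** 2)   # Gaussian MI estimate, in nats
print(f"estimated rho = {r:.3f}, Gaussian MI = {mi_closed_form:.3f} nats")
print(f"exact value for rho = {rho}: {-0.5 * np.log(1 - rho ** 2):.3f} nats")
```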