2009
DOI: 10.1590/s0103-97332009000400011

Generalized-generalized entropies and limit distributions

Abstract: Limit distributions are not limited to uncorrelated variables but can be constructively derived for a large class of correlated random variables, as was shown e.g. in the context of large deviation theory [1], and recently in a very general setting by Hilhorst and Schehr [2]. At the same time it has been conjectured, based on numerical evidence, that several limit distributions originating from specific correlated random processes follow q-Gaussians. It could be shown that this is not the case for some of thes…
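For readers unfamiliar with the q-Gaussian family discussed in the abstract, a minimal numerical sketch follows. It uses the standard q-exponential definition e_q(x) = [1 + (1-q)x]^{1/(1-q)} (restricted to where the bracket is positive) and normalizes on a finite grid numerically rather than via the exact normalization constant; the grid and parameter values are illustrative choices, not taken from the paper.

```python
import numpy as np

def q_exponential(x, q):
    """q-exponential e_q(x) = [1 + (1-q)x]_+^{1/(1-q)}; reduces to exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return np.exp(x)
    base = 1.0 + (1.0 - q) * x
    return np.where(base > 0, np.abs(base) ** (1.0 / (1.0 - q)), 0.0)

def q_gaussian(x, q, beta=1.0):
    """q-Gaussian density proportional to e_q(-beta x^2), normalized numerically
    on the (assumed uniform) grid x."""
    unnorm = q_exponential(-beta * x * x, q)
    dx = x[1] - x[0]
    return unnorm / (unnorm.sum() * dx)

x = np.linspace(-5.0, 5.0, 2001)
dx = x[1] - x[0]

p1 = q_gaussian(x, q=1.0)   # q -> 1: ordinary Gaussian limit
p2 = q_gaussian(x, q=2.0)   # q = 2: heavy-tailed (Cauchy-like) member

gauss = np.exp(-x ** 2)
gauss /= gauss.sum() * dx

print(np.allclose(p1, gauss))        # q = 1 recovers the Gaussian on this grid
print(p2[-1] > gauss[-1])            # q = 2 has heavier tails than the Gaussian
```

The comparison at the grid edge illustrates the qualitative point behind the q-Gaussian conjectures: for q > 1 the density decays as a power law rather than exponentially.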

Cited by 4 publications (4 citation statements)
References 17 publications (39 reference statements)
“…The summary of results of his investigation is presented in Table 2. While the work of Wang (2008) deals with the determination of entropy for the single-variable case, Thurner and Hanel (2009) dealt with correlated Gaussian variables, found that the entropy of the limiting distribution is non-Tsallis, and gave a generalized expression for the entropy. Oikonomou and Bagci (2017) showed, using classical calculus, that the MaxEnt distributions obtained from the Tsallis (2009) entropy and Renyi's entropy for specified constraints (information) are different.…”
Section: Information Theoretic Entropy and MEP: Brief Review (mentioning)
confidence: 99%
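The point made in the statement above — that Tsallis and Rényi entropies lead to different MaxEnt distributions despite being closely related — can be illustrated with the standard definitions of the two entropies. A minimal numerical sketch (the distribution p is an arbitrary example, not data from any of the cited papers):

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q^T = (1 - sum_i p_i^q) / (q - 1)."""
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def renyi_entropy(p, q):
    """Renyi entropy S_q^R = ln(sum_i p_i^q) / (1 - q)."""
    return np.log(np.sum(p ** q)) / (1.0 - q)

p = np.array([0.5, 0.25, 0.125, 0.125])  # example probability distribution
q = 2.0

st = tsallis_entropy(p, q)
sr = renyi_entropy(p, q)

# The two entropies are monotone functions of each other:
#   S_q^R = ln(1 + (1 - q) * S_q^T) / (1 - q)
print(np.isclose(sr, np.log(1.0 + (1.0 - q) * st) / (1.0 - q)))   # True
print(np.isclose(st, sr))                                         # False
```

Because the relation between the two is a nonlinear monotone map, maximizing them under the same generic constraints need not single out the same distribution, which is consistent with the difference the statement attributes to Oikonomou and Bagci (2017).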
“…It is worth noting that generalized trace-form entropies and the related distributions obtained by means of a variational principle à la Jaynes have been discussed in [45,51,52]. In particular, [51,52] have derived the most general trace-form entropy compatible with the first three Khinchin axioms, which turns out to be a sub-family of the most general class (9).…”
Section: Preliminary (mentioning)
confidence: 99%
“…(25), in the cases δ = 2 and δ = 3, for which the equilibrium distributions were calculated exactly for a general φ(x) in the previous section [cf. Eqs.…”
Section: Normalization and Chemical Potential Analysis (mentioning)
confidence: 99%