2016
DOI: 10.3390/e18020042
Non-Extensive Entropic Distance Based on Diffusion: Restrictions on Parameters in Entropy Formulae

Abstract: Based on a diffusion-like master equation we propose a formula using the Bregman divergence for measuring entropic distance in terms of different non-extensive entropy expressions. We obtain the non-extensivity parameter range for a universal approach to the stationary distribution by simple diffusive dynamics for the Tsallis and the Kaniadakis entropies, for the Hanel-Thurner generalization, and finally for a recently suggested log-log type entropy formula which belongs to diverging variance in the inverse te…
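The abstract's central construction, an entropic distance built from a Bregman divergence on a trace-form entropy, can be sketched numerically. Below is a minimal illustration (not the paper's exact formula) using the Tsallis generator phi(x) = (x^q − x)/(q − 1), whose Bregman divergence reduces to the Kullback-Leibler divergence as q → 1; the function names and the test distributions are hypothetical.

```python
import numpy as np

def tsallis_generator(x, q):
    """Convex generator phi(x) = (x**q - x) / (q - 1); its Bregman
    divergence recovers the KL divergence in the limit q -> 1."""
    return (x**q - x) / (q - 1.0)

def bregman_entropic_distance(p, s, q):
    """Trace-form Bregman divergence
    rho(P, S) = sum_n [phi(P_n) - phi(S_n) - phi'(S_n) (P_n - S_n)],
    which is nonnegative for convex phi and zero iff P = S."""
    phi_p = tsallis_generator(p, q)
    phi_s = tsallis_generator(s, q)
    dphi_s = (q * s**(q - 1.0) - 1.0) / (q - 1.0)   # phi'(s)
    return float(np.sum(phi_p - phi_s - dphi_s * (p - s)))

# hypothetical example distributions
p = np.array([0.5, 0.3, 0.2])
s = np.array([0.4, 0.4, 0.2])
d = bregman_entropic_distance(p, s, q=1.2)
```

Because phi is convex for q > 0, the distance is nonnegative and vanishes only at the stationary distribution, which is what makes it usable as a Lyapunov function for the diffusive dynamics discussed in the paper.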

Cited by 4 publications (4 citation statements); References 38 publications
“…The change of the entropic distance is governed by its definition and the evolution equation for the distribution. The entropic distance of an actual, time-dependent distribution, P_n(t), to the stationary distribution, Q_n, has the trace form [40]:…”
Section: Rates Leading to Gamma Distribution
confidence: 99%
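The citing passage describes an entropic distance whose decay is driven by the evolution equation for the distribution. A minimal sketch of that mechanism, assuming the simplest case (symmetric nearest-neighbour diffusion, uniform stationary state, KL divergence as the trace-form distance; all names hypothetical):

```python
import numpy as np

def kl_distance(p, q):
    """Boltzmann-Gibbs entropic distance (KL divergence) in trace form."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def diffusion_step(p, w=0.1):
    """One explicit step of the symmetric master equation
    dP_n/dt = w (P_{n+1} + P_{n-1} - 2 P_n) with reflecting boundaries;
    total probability is conserved."""
    new = p.copy()
    new[1:-1] += w * (p[2:] + p[:-2] - 2.0 * p[1:-1])
    new[0] += w * (p[1] - p[0])
    new[-1] += w * (p[-2] - p[-1])
    return new

N = 8
p = np.zeros(N)
p[0] = 1.0                        # start far from stationarity
stationary = np.full(N, 1.0 / N)  # uniform stationary state of symmetric diffusion

dists = []
for _ in range(200):
    dists.append(kl_distance(p, stationary))
    p = diffusion_step(p)
# the entropic distance shrinks monotonically toward zero
```

Since the update matrix is doubly stochastic, the KL distance to the uniform distribution is nonincreasing at every step, which is the Lyapunov property the quoted passage relies on.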
“…Using further assumptions about reservoir fluctuations, further entropy formulas can be constructed, as expectation values of formal logarithms, behaving additively [194].…”
Section: Physical Sources of NB States
confidence: 99%
“…The conditional entropy of two variables X and Y, taking values x and y respectively, is defined by H(X|Y) = −∑_{x,y} p(x,y) log_b p(x|y), where b is the logarithm base. ED has the following properties [21]: ED is symmetric; ED is zero for comparing a distribution with itself.…”
Section: Simulation Models
confidence: 99%
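The conditional entropy defined in the quoted passage is straightforward to compute from a joint distribution. A minimal sketch, with the joint distribution represented as a hypothetical dict mapping (x, y) pairs to probabilities and base b = 2 by default:

```python
import math

def conditional_entropy(joint, base=2.0):
    """H(X|Y) = -sum_{x,y} p(x,y) log_b p(x|y), with p(x|y) = p(x,y)/p(y)."""
    # marginal p(y)
    py = {}
    for (x, y), p in joint.items():
        py[y] = py.get(y, 0.0) + p
    h = 0.0
    for (x, y), p in joint.items():
        if p > 0:
            h -= p * math.log(p / py[y], base)
    return h

# a fair coin X copied into Y: knowing Y determines X, so H(X|Y) = 0
joint_copy = {(0, 0): 0.5, (1, 1): 0.5}
# two independent fair coins: H(X|Y) = H(X) = 1 bit
joint_ind = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
```

Note that H(X|Y) itself is not symmetric in X and Y; the symmetry property in the quote refers to the entropic distance (ED) built from such quantities, not to the conditional entropy.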