2018
DOI: 10.3390/universe4010010
Entropic Distance for Nonlinear Master Equation

Abstract: More and more works deal with statistical systems far from equilibrium, dominated by unidirectional stochastic processes, augmented by rare resets. We analyze the construction of the entropic distance measure appropriate for such dynamics. We demonstrate that a power-like nonlinearity in the state probability in the master equation naturally leads to the Tsallis (Havrda-Charvát, Aczél-Daróczy) q-entropy formula in the context of seeking the maximal entropy state at stationarity. A few possible applications …

Cited by 6 publications (7 citation statements)
References 12 publications
“…We conclude that a simple master equation with state-dependent growth and reset terms [32,33] performs beautifully in describing the income distribution over the whole range of income values. In agreement with this simple reasoning, the analyses of real-world data revealed that the growth in salaries is on average preferential, and that the reset rate depends on income in a simple form: it is negative in the low-income region and saturates at a constant positive value for high salaries.…”
Section: Discussion
confidence: 99%
“…In the above equation we denoted by δ(x) the Dirac delta functional. It has been proven that, under very general conditions, the above dynamical evolution equation converges to a steady state with a stationary probability density ρs(x) [33]. This stationary probability density is derived from the condition:…”
Section: The Growth and Reset Process
confidence: 99%
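The growth-and-reset dynamics quoted above can be illustrated with a minimal Monte Carlo sketch. This is not the authors' code: the rate functions `mu` and `gamma`, the constant-rate choice, and all parameter values are illustrative assumptions. Each walker grows one step at rate μ(n) and is reset to the origin at rate γ(n); with constant rates the empirical histogram settles to a geometric stationary distribution whose mean is μ/γ, in line with the quoted convergence to a stationary density.

```python
import random

def simulate_growth_reset(n_walkers, t_max, dt, mu, gamma, seed=1):
    """Monte Carlo for a unidirectional growth process with resets:
    each walker grows n -> n+1 at rate mu(n) and is reset to n = 0
    at rate gamma(n). mu and gamma are callables of the state n."""
    rng = random.Random(seed)
    states = [0] * n_walkers
    for _ in range(int(t_max / dt)):
        for i, n in enumerate(states):
            if rng.random() < gamma(n) * dt:    # rare reset to the origin
                states[i] = 0
            elif rng.random() < mu(n) * dt:     # unidirectional growth step
                states[i] = n + 1
    return states

# Constant rates (an illustrative choice): the stationary distribution
# is then geometric, P_n ~ (mu/(mu+gamma))^n, with mean mu/gamma = 2.
final = simulate_growth_reset(n_walkers=1000, t_max=75.0, dt=0.05,
                              mu=lambda n: 1.0, gamma=lambda n: 0.5)
mean_n = sum(final) / len(final)
```

State-dependent rates (e.g. preferential growth, income-dependent reset as in the quoted discussion) are obtained simply by passing different callables for `mu` and `gamma`.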
“…Although both ways are legitimate, in this paper we follow the second one. Given an evolution dynamics for the microstate probabilities, we first seek an expedient formula for the entropic divergence [28]: a non-negative measure between two probability distributions (in a continuous model, probability distribution functions (PDFs)) which shrinks during the dynamical evolution, possibly for two arbitrary initial distributions.…”
Section: Motivation
confidence: 99%
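The shrinking entropic divergence described in this passage can be checked numerically for a linear master equation: evolving two different initial distributions with the same stochastic matrix, the Kullback-Leibler divergence between them never increases. A minimal sketch, where the matrix `T` and the initial distributions are arbitrary illustrative choices, not taken from the paper:

```python
import math

def step(p, T):
    """One step of a linear master equation: T[j][i] is the transition
    probability from state i to state j; each column of T sums to 1."""
    return [sum(T[j][i] * p[i] for i in range(len(p)))
            for j in range(len(T))]

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for strictly positive q."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Illustrative column-stochastic transition matrix.
T = [[0.90, 0.20, 0.10],
     [0.05, 0.70, 0.20],
     [0.05, 0.10, 0.70]]

p = [0.80, 0.10, 0.10]   # two arbitrary initial distributions
q = [0.10, 0.10, 0.80]
divs = []
for _ in range(20):
    divs.append(kl_divergence(p, q))
    p, q = step(p, T), step(q, T)
# divs is non-increasing: the divergence shrinks under the evolution.
```

This monotone shrinking is the linear-master-equation special case; the quoted program is to find the divergence formula that plays the same role for nonlinear dynamics.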
“…As part of this presentation, we repeat some formulas already published earlier [26,28]. Our purpose is to provide the reader with a self-contained chain of thought, followable without further reading.…”
Section: Motivation
confidence: 99%
“…We also compute the Entropic Distance (ED) and compare it with AF. Entropic Distance, also called “relative entropy”, is the difference between the entropies with and without a prior condition [20]. The conditional entropy of two variables X and Y taking values x and y, respectively, is defined by H(X|Y) = −∑_{x,y} p(x,y) log_b p(x|y), where b is the logarithm base.…”
Section: Simulation Models
confidence: 99%
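The conditional-entropy definition quoted above can be sketched directly. The joint distributions below are toy examples of my own, not from the cited simulation study:

```python
import math
from collections import defaultdict

def conditional_entropy(joint, b=2.0):
    """H(X|Y) = -sum_{x,y} p(x,y) log_b p(x|y), for a joint
    distribution given as a dict {(x, y): p(x, y)}."""
    p_y = defaultdict(float)               # marginal p(y)
    for (x, y), p in joint.items():
        p_y[y] += p
    h = 0.0
    for (x, y), p in joint.items():
        if p > 0:                          # p(x|y) = p(x,y) / p(y)
            h -= p * math.log(p / p_y[y], b)
    return h

# Independent, uniform X and Y: conditioning on Y tells us nothing
# about X, so H(X|Y) equals H(X) = 1 bit.
h_indep = conditional_entropy({(0, 0): 0.25, (0, 1): 0.25,
                               (1, 0): 0.25, (1, 1): 0.25})

# Perfectly correlated X and Y: Y determines X, so H(X|Y) = 0 bits.
h_corr = conditional_entropy({(0, 0): 0.5, (1, 1): 0.5})
```

The "difference between entropies with and without a prior condition", H(X) − H(X|Y), is then the mutual information between X and Y: 0 bits in the independent example and 1 bit in the correlated one.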