2002
DOI: 10.1016/s0378-4371(01)00502-7

Entropy, extropy and information potential in stochastic systems far from equilibrium

Cited by 13 publications (8 citation statements)
References 37 publications
“…The quantity −S(p_0|p_eq) has been called the "extropy" of the distribution p_0 [4,11]: it is the total entropy production, or loss of information, when the system relaxes from p_0 to equilibrium once external constraints are suppressed. Since it is positive in non-equilibrium conditions, inequality (11) leaves open the possibility of positive work production in a non-equilibrium system, if the work necessary for creating or maintaining the non-equilibrium state is not taken into account.…”
Section: Constrained Non-equilibrium Systems, Information Potential
confidence: 99%
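The extropy quoted above can be sketched numerically: with the sign convention of the statement, −S(p_0|p_eq) is the relative entropy (Kullback–Leibler divergence) of p_0 with respect to p_eq, which is non-negative and vanishes only at equilibrium. The function name and the toy two-state distributions below are illustrative assumptions, not taken from the paper.

```python
import math

def extropy(p, p_eq):
    """Relative entropy of p with respect to p_eq, i.e. -S(p|p_eq)
    in the quoted notation: the total entropy produced when the
    system relaxes from p to the equilibrium distribution p_eq.
    Both arguments are lists of probabilities over the same states."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, p_eq) if pi > 0)

# Hypothetical two-state system: uniform equilibrium, constrained
# initial state biased toward the first state.
p_eq = [0.5, 0.5]
p0 = [0.9, 0.1]

print(extropy(p0, p_eq))   # strictly positive out of equilibrium
print(extropy(p_eq, p_eq)) # zero at equilibrium
```

As the statement notes, this quantity bounds the work that could in principle be extracted while the system relaxes, provided the cost of preparing the non-equilibrium state is ignored.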
“…This is why we have recently started to develop another framework for non-equilibrium thermodynamics, based on the notion of coarse-grained descriptions and of stochastic dynamics in the corresponding state space (see [2][3][4][5]). A different approach, based on a Hamiltonian formalism, was introduced by Jarzynski [6] and later widely discussed (see for instance [7][8][9][10] and references therein).…”
Section: Introduction
confidence: 99%
“…For such a system, when the control parameter is held fixed, this free energy difference is a nonincreasing function of time. If the system is allowed to fully equilibrate, it is equal to the total entropy produced (also known as the extropy) [3]. Hence this free energy difference has also been called an entropy deficiency [4].…”
confidence: 99%
“…However, this binary information should be brought into correspondence with the measure of information expressed in thermodynamic terms. Here we should take into account that I_M (in contrast to I_S) corresponds, under the thermodynamic analogy, to the concept of extropy (the distance of a physical system from its equilibrium state on the entropic scale, i.e., the distance from the maximum of the entropy) [1], rather than to that of entropy.…”
Section: Language Of Information
confidence: 99%
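The reading of extropy above, as the distance from the entropy maximum, agrees with the relative-entropy definition whenever the equilibrium distribution is uniform: the KL divergence from a uniform distribution over N states reduces to log N − H(p). The following sketch checks this identity on an arbitrary toy distribution (the numbers and names are illustrative assumptions).

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def extropy_uniform(p):
    """Extropy relative to a uniform equilibrium over len(p) states:
    log N - H(p), the distance of p from the entropy maximum."""
    return math.log(len(p)) - shannon_entropy(p)

p = [0.7, 0.2, 0.1]

# Same quantity computed as the KL divergence of p from the
# uniform distribution u_i = 1/N.
kl = sum(pi * math.log(pi * len(p)) for pi in p if pi > 0)

print(extropy_uniform(p), kl)  # the two values coincide
```

For the uniform distribution itself the extropy is zero, consistent with the system already sitting at the entropy maximum.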