2007
DOI: 10.1063/1.2821255
Origins of the Combinatorial Basis of Entropy

Abstract: The combinatorial basis of entropy, given by Boltzmann, can be written $H = N^{-1} \ln \mathbb{W}$, where $H$ is the dimensionless entropy, $N$ is the number of entities and $\mathbb{W}$ is the number of ways in which a given realization of a system can occur (its statistical weight). This can be broadened to give generalized combinatorial (or probabilistic) definitions of entropy and cross-entropy: $H = \kappa (\phi(\mathbb{W}) + C)$ and $D = -\kappa (\phi(\mathbb{P}) + C)$, where $\mathbb{P}$ is the probability of a given realization of a system …
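To make the combinatorial definition concrete, here is a minimal sketch (mine, not the paper's) that evaluates $H = N^{-1} \ln \mathbb{W}$ for one standard choice of weight, the multinomial $\mathbb{W} = N! / \prod_i n_i!$, and compares it with the Shannon form $-\sum_i p_i \ln p_i$, which it approaches as $N \to \infty$ by Stirling's approximation (the paper's generalized definitions cover other weights as well):

```python
from math import lgamma, log

def combinatorial_entropy(counts):
    """H = (1/N) ln W for the (assumed) multinomial weight W = N!/prod(n_i!).
    lgamma(n + 1) = ln(n!) keeps the computation in log space."""
    n_total = sum(counts)
    ln_w = lgamma(n_total + 1) - sum(lgamma(n + 1) for n in counts)
    return ln_w / n_total

def shannon_entropy(counts):
    """Asymptotic (large-N) form -sum p_i ln p_i with p_i = n_i / N."""
    n_total = sum(counts)
    return -sum(n / n_total * log(n / n_total) for n in counts if n > 0)

# The combinatorial H converges to the Shannon form as N grows:
for scale in (1, 10, 1000):
    occupancies = [2 * scale, 3 * scale, 5 * scale]
    print(sum(occupancies), combinatorial_entropy(occupancies),
          shannon_entropy(occupancies))
```

At $N = 10$ the two values differ appreciably (about 0.78 versus 1.03); at $N = 10{,}000$ they agree to within roughly 0.1%, which is the usual Stirling-approximation justification for treating the Shannon form as the large-$N$ limit of Boltzmann's combinatorial definition.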

Cited by 13 publications (23 citation statements)
References 76 publications
“…Adding some more detail to this argument, I think there are two fundamental reasons why we are justified to posit the correspondence of MaxEnt and MEP, beyond the purely mathematical considerations (Niven 2007, 2009). One is an epistemological…”
Section: In a General Structure of Evolving Systems under Natural Selection
confidence: 99%
“…This is to reconcile the thermodynamic use of entropy and the information-theoretic use, which has long been achieved in the general mathematical treatment of entropy (surveyed e.g. by Niven 2007). Both have been scrutinized by Ayres (1994) in his seminal contribution, and Ayres had already developed two important insights, starting out from an observation that directly matches Georgescu-Roegen's original qualms with statistical mechanics: this is that information is an intensive (hence qualitative, or dialectical, variable in Georgescu-Roegen's parlance), whereas entropy in thermodynamics is an extensive variable (Ayres 1994: 36).…”
Section: The Problem of the Contextuality of Entropy: Fertile Ground …
confidence: 99%
“…where H is the dimensionless entropy per particle and N is the (actual) number of particles [3]. Eq.…”
Section: Introduction
confidence: 99%
“…Maximisation of H (MaxEnt) or minimisation of D (MinXEnt), subject to its constraints, therefore selects the realization of highest weight W or probability P, a technique which can be termed the maximum probability principle (MaxProb) [1,2,3,4,5]. The inferred distribution is then used to represent the system.…”
Section: Introduction
confidence: 99%
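The MaxProb idea in the last excerpt can be checked directly for small systems: among all realizations that place $N$ entities in $s$ equiprobable states, the one of highest weight $\mathbb{W}$ is exactly the distribution MaxEnt selects. A brute-force sketch of this (my illustration, not code from the paper or its citers):

```python
from math import lgamma

def compositions(n, k):
    """Yield every occupancy vector (n_1, ..., n_k) with sum n."""
    if k == 1:
        yield (n,)
        return
    for head in range(n + 1):
        for tail in compositions(n - head, k - 1):
            yield (head,) + tail

def ln_weight(counts):
    """ln W = ln N! - sum_i ln n_i!  (multinomial statistical weight)."""
    n = sum(counts)
    return lgamma(n + 1) - sum(lgamma(c + 1) for c in counts)

# Exhaustive MaxProb: select the realization of highest weight W.
N, s = 12, 3
best = max(compositions(N, s), key=ln_weight)
print(best)  # (4, 4, 4) -- the uniform occupancy, as unconstrained MaxEnt predicts
```

With prior probabilities $q_i$ attached to the states, the quantity to maximize becomes $\ln \mathbb{P} = \ln \mathbb{W} + \sum_i n_i \ln q_i$, and the same exhaustive search reproduces the MinXEnt solution; moment constraints can be imposed by filtering the candidate compositions before taking the maximum.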