2006
DOI: 10.1142/s0217984906011529

Boltzmann–Shannon Entropy: Generalization and Application

Abstract: The paper deals with the generalization of both the Boltzmann entropy and the Boltzmann distribution in the light of the most-probable interpretation of statistical equilibrium. The statistical analysis of the generalized entropy and distribution leads to new results of significant physical importance.

Cited by 10 publications (9 citation statements)
References 6 publications

“…Thermodynamic probability in formula (3) is not an ordinary probability but an integer. However, it is possible to modify the Boltzmann entropy so that the notion of a probability distribution can be introduced into Boltzmann statistics [8]. If an isolated system consisting of $N$ molecules distributed over $n$ energy states is examined, then, assuming a fixed number of molecules and a fixed total energy, the total number of microscopic states of the system is given by the formula $W = N!/(N_1!\,N_2!\cdots N_n!)$ (4). Substituting (4) into (3), the Boltzmann entropy is obtained in the following form: $S = k\ln W \approx -kN\sum_{i=1}^{n} p_i\ln p_i$, where $p_i = N_i/N$ and, for large $N$, $p_i$ means the probability that a molecule is in the $i$-th energy state.…”
Section: A Brief History of the Emergence and Development of the E… (mentioning)
confidence: 99%
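The derivation in the excerpt can be checked numerically. The sketch below is mine, not code from the excerpt or the cited paper: it compares the exact Boltzmann entropy $S = k\ln W$ built from formula (4) with the large-$N$ Stirling form $-kN\sum_i p_i\ln p_i$. The occupation numbers $N_i$ are invented purely for illustration.

```python
import math

# Hypothetical occupation numbers N_i for n = 3 energy states (illustrative only).
occupations = [6000, 3000, 1000]
N = sum(occupations)
k_B = 1.380649e-23                        # Boltzmann constant, J/K

# Formula (4): W = N! / (N_1! N_2! ... N_n!), evaluated via log-factorials (lgamma)
# to avoid overflow for large N.
ln_W = math.lgamma(N + 1) - sum(math.lgamma(n_i + 1) for n_i in occupations)

# Formula (3): S = k ln W.
S_exact = k_B * ln_W

# Large-N (Stirling) form: S ~ -k N sum_i p_i ln p_i, with p_i = N_i / N.
p = [n_i / N for n_i in occupations]
S_stirling = -k_B * N * sum(p_i * math.log(p_i) for p_i in p)

print(f"S from k ln W            : {S_exact:.6e} J/K")
print(f"S from -kN sum p_i ln p_i: {S_stirling:.6e} J/K")
```

For occupation numbers of this size the two values agree to within a fraction of a percent, which is the point of the large-$N$ approximation used in the excerpt.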
“…26, 27 There is a known relationship between Shannon entropy and thermodynamic entropy in statistical mechanics, such that entropy can be used as a measure of the molecular disorder of a system. 28 Applications of entropy in communication theory provide a way to quantify the amount of information associated with a received message. 11 The amount of information gained by the receiver depends on the probability that a message (or event) will occur.…”
Section: Discussion (mentioning)
confidence: 99%
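As a concrete illustration of the last point in this excerpt, here is a short sketch (my own, not code from the cited work) that computes the self-information $-\log_2 p$ of individual messages and the Shannon entropy of a source; the message probabilities are made up for the example.

```python
import math

# Hypothetical message probabilities (illustrative only; not from the cited work).
message_probs = {"A": 0.5, "B": 0.25, "C": 0.125, "D": 0.125}

# Self-information of a single received message: I(m) = -log2 p(m), in bits.
# Rare messages carry more information than likely ones.
for msg, p in message_probs.items():
    print(f"message {msg}: p = {p:.3f}, information = {-math.log2(p):.2f} bits")

# Shannon entropy: the expected information per message, H = -sum p log2 p.
H = -sum(p * math.log2(p) for p in message_probs.values())
print(f"entropy of the source H = {H:.2f} bits per message")
```

Here the most likely message carries 1 bit and the rarest carry 3 bits each, and the source entropy works out to 1.75 bits per message, i.e. the average information gained per received message.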
“…To visualize the sustainability transformation of the system, it is advisable to use the negentropy index calculated with the Shannon formula [Chen, Li, 2011; Chakrabarti, Chakrabarty, 2007; Gray, 2009]. Entropy is a measure of the scattering of the possible states of a system as it changes (develops) over time.…”
Section: Input-oriented Model (mentioning)
confidence: 99%
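The excerpt does not spell out how the negentropy index is defined, so the sketch below uses one common convention, negentropy as the gap between the maximum entropy $\ln n$ and the Shannon entropy of the observed state distribution. Both the convention and the numbers are assumptions for illustration, not taken from the works cited in the excerpt.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum p ln p (natural log), skipping zero-probability states."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def negentropy_index(probs):
    """One common convention (an assumption here, not necessarily the cited authors'
    definition): negentropy J = H_max - H, with H_max = ln n for n possible states.
    J = 0 for a uniform (maximally scattered) distribution and grows with order."""
    return math.log(len(probs)) - shannon_entropy(probs)

# Hypothetical shares of a system's states at two points in time (illustrative only).
before = [0.25, 0.25, 0.25, 0.25]   # uniform: maximum scattering, zero negentropy
after  = [0.70, 0.20, 0.05, 0.05]   # more concentrated: higher negentropy

print(f"negentropy before: {negentropy_index(before):.3f} nats")
print(f"negentropy after : {negentropy_index(after):.3f} nats")
```

Under this convention a rise in the index over time signals that the system's states have become more concentrated, which is how the excerpt uses negentropy as a marker of the sustainability transformation.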