2005
DOI: 10.1155/ijmms.2005.2847

Shannon entropy: axiomatic characterization and application

Abstract: We have presented a new axiomatic derivation of Shannon entropy for a discrete probability distribution on the basis of the postulates of additivity and concavity of the entropy function. We have then modified Shannon entropy to take account of observational uncertainty. The modified entropy reduces, in the limiting case, to the form of Shannon differential entropy. As an application, we have derived the expression for classical entropy of statistical mechanics from the quantized form of the entropy.
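For orientation, the two quantities the abstract refers to can be written out. The display below is a sketch in the standard notation of this literature (k_B > 0 a constant fixing units, p_i discrete probabilities, ρ(x) a PDF, h a bin width representing observational uncertainty); it is consistent with the excerpts quoted further down but is not copied from the paper:

    % Discrete Shannon entropy of a distribution (p_1, ..., p_n)
    S = -k_B \sum_{i=1}^{n} p_i \ln p_i

    % Modified entropy with observational uncertainty h; for h = 1 this
    % is the standard Shannon differential entropy -k_B \int \rho \ln \rho \, dx
    H(\rho, h) = -k_B \int \rho(x) \ln\bigl(h\,\rho(x)\bigr)\, dx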

Cited by 29 publications (19 citation statements). References 15 publications.
“…There is a vast literature on characterizations of Shannon entropy; see, e.g., [16], [17], [18], [19]. Our result is a slightly weaker version of the characterization in [10, Lemma 5], which does not presuppose continuity, and (in addition to symmetry and additivity) only assumes weak subadditivity, that is, (ii) in the special case that B has dimension two.…”
Section: Main Results and Their Context
Confidence: 77%
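The weak subadditivity invoked here can be made explicit. The display below states it in generic notation (H the Shannon entropy, A and B two subsystems); it is an illustrative paraphrase assuming that condition (ii) of the cited paper is subadditivity, not a verbatim reproduction:

    % Subadditivity: the entropy of a joint system is at most
    % the sum of the entropies of its parts
    H(AB) \le H(A) + H(B)

    % Weak subadditivity, as described in the excerpt, requires this
    % only when the system B is two-dimensional.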
“…Combining (12), (16), (17), (18), and (19), and setting N := max{N(a), N′, N″, N‴}, we get…”
Section: It Is Then Elementary To See That the Inequality H
Confidence: 99%
“…Consider the function H(x, h) := −k_B ln(hρ(x)) and define the differential entropy (or continuous entropy) for a PDF ρ(x) as [4, 6, 14–16] H(ρ, h) :=…”
Section: On the Continuous Shannon and Rényi Entropies
Confidence: 99%
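The definition is truncated by the snippet. Judging from the pointwise function H(x, h) just quoted and the description of H(ρ, h) in the following excerpt (bins of width h over the range a ≤ x ≤ b, with the integrand ρ·H vanishing where ρ does), the continuation is presumably the ρ-average of H(x, h); the display below is an assumption, not the quoted equation:

    % Presumed continuation (assumption): differential entropy as the
    % expectation of H(x, h) = -k_B ln(h ρ(x)) over the range a ≤ x ≤ b
    H(\rho, h) := \int_a^b \rho(x)\, H(x, h)\, dx
                = -k_B \int_a^b \rho(x) \ln\bigl(h\,\rho(x)\bigr)\, dx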
“…(10) reduces to the standard definition of differential entropy and, for h > 0, to the modified differential entropy proposed in [15], in which the range a ≤ x ≤ b is divided into bins of width h. The constant k_B > 0 fixes the unit of measurement of H(ρ, h), ln(·) is the natural logarithm giving the information in nats, and by convention 0 ln 0 = 0 (for a visual representation, plot x ln(x)). For a segment of H(x, h) defined by an interval [x_{n−1}, x_n] in which either ρ(x_{n−1}) → 0 or ρ(x_n) → 0, H(x_{n−1}, h) → ∞ or H(x_n, h) → ∞, but, according to the convention, H(x_{n−1}, h) × 0 → 0 or H(x_n, h) × 0 → 0.…”
Section: On the Continuous Shannon and Rényi Entropies
Confidence: 99%
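As a numerical illustration of the quantity discussed in this excerpt, the sketch below approximates H(ρ, h) = −k_B ∫ ρ(x) ln(hρ(x)) dx on a grid, applying the 0 ln 0 = 0 convention wherever ρ vanishes. The function name, the default k_B = 1, and the uniform test density are illustrative assumptions, not taken from the paper.

    import numpy as np

    def modified_differential_entropy(rho, a, b, h, k_B=1.0, n=100_000):
        """Approximate H(rho, h) = -k_B * integral of rho(x) ln(h rho(x)) dx
        over [a, b], with the convention 0 ln 0 = 0 where rho(x) = 0.
        (Illustrative sketch; name and defaults are assumptions.)"""
        x = np.linspace(a, b, n)
        p = rho(x)
        integrand = np.zeros_like(p)
        mask = p > 0                      # 0 ln 0 = 0: skip points where rho = 0
        integrand[mask] = p[mask] * np.log(h * p[mask])
        return -k_B * np.trapz(integrand, x)

    # Uniform density rho(x) = 1/2 on [0, 2] with bin width h = 0.1:
    # H = -ln(h/2) = ln(20) ≈ 2.9957 nats.
    H = modified_differential_entropy(lambda x: np.full_like(x, 0.5), 0.0, 2.0, h=0.1)
    print(H)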
“…(1.5) Chakrabarti and Chakrabarti [2] have presented a new axiomatic derivation of Shannon entropy for a discrete probability distribution on the basis of the postulates of additivity and concavity of the entropy function. The authors have then modified Shannon entropy to take account of observational uncertainty.…”
Section: Introduction
Confidence: 99%