2021
DOI: 10.1088/1361-6404/abe361
Variability as a better characterization of Shannon entropy

Abstract: The Shannon entropy, one of the cornerstones of information theory, is widely used in physics, particularly in statistical mechanics. Yet its characterization and connection to physics remain vague, leaving ample room for misconceptions and misunderstanding. We will show that the Shannon entropy can be fully understood as measuring the variability of the elements within a given distribution: it characterizes how much variation can be found within a collection of objects. We will see that it is the only indicat…
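The abstract's central claim, that Shannon entropy measures the variability of the elements within a distribution, can be illustrated with a minimal sketch (not from the paper; the distributions below are made up for illustration): a distribution whose elements are all alike has low entropy, while a maximally varied (uniform) one has the highest.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits.

    Terms with p_i == 0 contribute nothing, by the usual convention
    0 * log 0 = 0.
    """
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Four equally likely outcomes: maximal variability, H = log2(4) = 2 bits.
uniform = [0.25, 0.25, 0.25, 0.25]

# Almost all mass on one outcome: the elements barely vary, H ~ 0.24 bits.
peaked = [0.97, 0.01, 0.01, 0.01]

print(shannon_entropy(uniform))  # 2.0
print(shannon_entropy(peaked))
```

Reading entropy this way, the uniform distribution is the "most varied" collection possible over four outcomes, and any concentration of probability mass reduces the variability and hence the entropy.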

Cited by 13 publications (13 citation statements)
References 24 publications
“…One has to state here that entropy stands out from those measures, since it can be shown to measure the variability of the elements within a given distribution, and that its expression is not arbitrary, as it is the only linear indicator for such a concept [9].…”
Section: Discussion
confidence: 99%
“…One part of the entropy is due to the standard deviation of the signal, while another part is related to the shape of its distribution. As such, it offers a fully relevant measure of variability [9].…”
Section: What Is Entropy?
confidence: 99%
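The statement above, that one part of the entropy is due to the standard deviation of the signal while another part reflects the shape of its distribution, corresponds to a standard decomposition of differential entropy: h(X) = ln(σ) + c, where c depends only on the standardized shape. A short sketch using the closed-form entropies of the Gaussian and uniform distributions (an illustration under that assumption, not code from the cited works):

```python
import math

def gaussian_entropy(sigma):
    """Differential entropy (nats) of a Gaussian with standard deviation sigma:
    h = 0.5 * ln(2*pi*e*sigma^2) = ln(sigma) + 0.5 * ln(2*pi*e)."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma**2)

def uniform_entropy(width):
    """Differential entropy (nats) of a uniform distribution of given width:
    h = ln(width); its standard deviation is width / sqrt(12)."""
    return math.log(width)

sigma = 3.0
# Subtracting ln(sigma) isolates the shape-dependent part:
gaussian_shape = gaussian_entropy(sigma) - math.log(sigma)   # 0.5*ln(2*pi*e) ~ 1.419
width = sigma * math.sqrt(12)  # uniform with the same standard deviation
uniform_shape = uniform_entropy(width) - math.log(sigma)     # 0.5*ln(12) ~ 1.242

print(gaussian_shape, uniform_shape)
```

Rescaling a signal shifts its entropy by ln of the scale factor, while changing its shape at fixed standard deviation changes only the additive constant, which is the sense in which entropy captures both spread and shape.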
“…With the purpose of performing a quantitative analysis of the spatial dimension of behavior between the two groups and conditions, and to facilitate its comparison, in Figure 5 we show the entropy values related to the organism’s location for all rats in each group in the last five sessions in each condition. We implemented the entropy location measure because it apprehends in an objective and quantitative manner the variability of the elements within a given distribution (Carcasi & Aidala, 2021). Then, in this study, entropy describes the variability of the organism’s location distribution for a given whole session (7,200 frames).…”
Section: Results
confidence: 99%
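A location-entropy measure of the kind described above can be sketched as follows (a hypothetical reimplementation, assuming one zone label per video frame; the function and data layout are illustrative, not taken from the cited study):

```python
import math
from collections import Counter

def location_entropy(zone_ids):
    """Shannon entropy (bits) of an occupancy distribution over zones.

    zone_ids: sequence with one zone label per frame. The empirical
    probability of each zone is its frame count divided by the total,
    and the entropy summarizes the variability of the location
    distribution for the whole session.
    """
    counts = Counter(zone_ids)
    n = len(zone_ids)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A session of 7,200 frames spent entirely in one zone: no variability.
stay = [0] * 7200                       # entropy is 0 bits

# A session spread evenly over 100 zones: maximal variability,
# entropy log2(100) ~ 6.64 bits.
roam = [frame % 100 for frame in range(7200)]

print(location_entropy(stay), location_entropy(roam))
```

Between these extremes, partial concentration in a few preferred zones yields intermediate entropy values, which is what makes the measure useful for comparing groups and conditions.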
“…Shannon entropy is a measure related to a discrete random variable that indicates variability within a distribution; it is a continuous, monotonic, and linear indicator of how different the distribution's elements are from each other (Carcassi, Aidala & Barbour, 2021).…”
Section: Accumulated Time Of Stays In Each Of 100 Zones Of the Mofs F...
confidence: 99%
“…Shannon introduced entropy theory into the field of information, which broadened research on uncertainty measurement [27]. Since Shannon entropy was proposed, entropy theory has been widely developed and improved [45], [46], [47], [48], [49]. However, in Dempster-Shafer evidence theory, how to calculate the uncertainty of a BPA remains an open problem.…”
Section: A New Method for Measuring Uncertainty: A Measure of Uncertainty Based on Renyi Entropy
confidence: 99%