2015
DOI: 10.1140/epjb/e2015-60500-0
Information entropy of classical versus explosive percolation

Abstract: We study the Shannon entropy of the cluster size distribution in classical as well as explosive percolation, in order to estimate the uncertainty in the sizes of randomly chosen clusters. At the critical point the cluster size distribution is a power-law, i.e. there are clusters of all sizes, so one expects the information entropy to attain a maximum. As expected, our results show that the entropy attains a maximum at this point for classical percolation. Surprisingly, for explosive percolation the maximum ent…
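As a concrete illustration of the quantity the abstract describes, the Shannon entropy of the cluster size distribution is H = -Σ_s p_s ln p_s, where p_s is the probability that a randomly chosen cluster has size s. A minimal sketch in Python (the function name is ours, not from the paper):

```python
import math
from collections import Counter

def cluster_size_entropy(cluster_sizes):
    """Shannon entropy H = -sum_s p_s ln p_s of the cluster size
    distribution, where p_s is the fraction of clusters that have size s."""
    counts = Counter(cluster_sizes)   # n_s: number of clusters of size s
    total = sum(counts.values())      # total number of clusters
    h = 0.0
    for n in counts.values():
        p = n / total
        h -= p * math.log(p)
    return h

# All clusters the same size -> a single outcome, zero uncertainty:
print(cluster_size_entropy([5, 5, 5, 5]))   # 0.0
# Four distinct sizes, equally likely -> H = ln(4) ≈ 1.386:
print(cluster_size_entropy([1, 2, 3, 4]))
```

Near the percolation threshold the size distribution broadens toward a power law, so many sizes carry comparable weight and H grows, which is the intuition behind expecting a maximum at the critical point.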


Cited by 10 publications (14 citation statements)
References 17 publications
“…Interestingly, in [4], it was reported that the maximum point of H(t) is equal to t_c in the ER model, whereas that of H(t) is less than t_c in the EP models, as shown here in Fig. 1(a) and (b).…”
Section: Introduction
confidence: 50%
“…Using dCR, we are also able to understand why Ḣ(t) is minimum at t = t_c and why Ḧ(t) diverges at t = t_c in EP models, as reported in [4]. The rest of this paper is organized as follows.…”
Section: Introduction
confidence: 99%
“…In percolation, we regard each cluster as equivalent to a message; hence the sum cannot run over the cluster sizes but must run over the individual cluster labels. In fact, none of the existing probabilities known in percolation theory can be used to measure entropy, although there have been some attempts [43,44].…”
Section: Entropy and Order Parameter
confidence: 99%
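The distinction drawn in that quote can be made concrete. Summing over cluster labels means weighting each cluster i by p_i = s_i / N, the probability that a randomly chosen site belongs to cluster i, rather than grouping clusters by size. A sketch of this alternative (our own illustration, not code from the cited work):

```python
import math

def cluster_label_entropy(cluster_sizes):
    """Entropy with the sum over individual cluster labels:
    p_i = s_i / N is the probability that a randomly chosen
    site belongs to cluster i (N = total number of sites)."""
    n_sites = sum(cluster_sizes)
    h = 0.0
    for s in cluster_sizes:
        p = s / n_sites
        h -= p * math.log(p)
    return h

# Four equal clusters of size 5: p_i = 1/4 for each label,
# so H = ln(4) ≈ 1.386 even though all sizes are identical.
print(cluster_label_entropy([5, 5, 5, 5]))
```

Note the contrast with an entropy over the size distribution, which would vanish for this configuration (only one size occurs); the label-based sum instead registers the uncertainty about which cluster a given site falls into.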