2003
DOI: 10.1063/1.1530990

Regularities unseen, randomness observed: Levels of entropy convergence

Abstract: We study how the Shannon entropy of sequences produced by an information source converges to the source's entropy rate. We synthesize several phenomenological approaches to applying information theoretic measures of randomness and memory to stochastic and deterministic processes by using successive derivatives of the Shannon entropy growth curve. This leads, in turn, to natural measures of apparent memory stored in a source and the amounts of information that must be extracted from observations of a source in …
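The "successive derivatives of the Shannon entropy growth curve" start from the block entropy H(L) and its first difference h(L) = H(L) - H(L-1), which converges to the entropy rate. A minimal plug-in sketch of that estimate (not the authors' code; the biased-coin source and the sample length are illustrative assumptions):

    from collections import Counter
    from math import log2
    import random

    def block_entropy(seq, L):
        """Plug-in estimate of the block entropy H(L), in bits, from the
        empirical distribution of length-L words in seq."""
        counts = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
        total = sum(counts.values())
        return -sum((n / total) * log2(n / total) for n in counts.values())

    # Illustrative source: a biased coin (IID), so h_mu = H(0.3) ~ 0.881 bits.
    random.seed(0)
    seq = [1 if random.random() < 0.3 else 0 for _ in range(100_000)]

    prev = 0.0
    for L in range(1, 8):
        H = block_entropy(seq, L)
        # h(L) = H(L) - H(L-1): the length-L estimate of the entropy rate.
        print(f"L={L}  H(L)={H:.4f}  h(L)={H - prev:.4f}")
        prev = H

For this memoryless source h(L) is essentially flat at H(1); for sources with structure it decays toward the entropy rate, and how it converges is what the paper's measures quantify.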

Cited by 328 publications (531 citation statements); references 67 publications.
“…This is equivalent to the more typical random-variable "block" definition of entropy rate [11]: $\lim_{\delta \to \infty} H[T_{0:\delta}]/\delta$. Similarly, we define the single-measurement entropy rate as:…”
Section: Differential Information Rates
confidence: 99%
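For a stationary process the quoted block definition agrees with the conditional, per-symbol form; the standard identity (discrete-time notation assumed here, since the citing paper's continuous-time definition is truncated) is

    h_\mu \;=\; \lim_{L \to \infty} \frac{H[X_{0:L}]}{L}
          \;=\; \lim_{L \to \infty} H[X_L \mid X_{0:L}],

because the chain rule expands H[X_{0:L}] into a sum of conditional entropies, and the Cesàro mean of that non-increasing sequence converges to the same limit.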
“…To develop it, Sec. II describes the required new notation and definitions that enable extending the ε-machine framework which is otherwise well understood for discrete-time processes [1,11]. Sections III-V determine the causal and informational architecture of continuous-time renewal processes.…”
Section: Introduction
confidence: 99%
“…The information in an observation can be partitioned into two pieces: redundancy and entropy generation [1]. Our approach exploits this decomposition in order to assess how much predictive structure is present in a signal, i.e., where it falls on the complexity spectrum mentioned above.…”
Section: Introduction
confidence: 99%
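In the cited paper's terms this decomposition shows up in the asymptote of the entropy growth curve (restated here from the source's definitions rather than quoted):

    H(L) \;\simeq\; \mathbf{E} + h_\mu L \qquad (L \to \infty),

so an L-symbol observation carries h_\mu L bits of irreducible randomness (entropy generation) plus a length-independent redundant part, the excess entropy \mathbf{E} = \lim_{L \to \infty} [H(L) - h_\mu L].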
“…In particular, the determinism of the state transitions enables direct calculation of information-theoretic quantities, such as the process's rate of information production (source entropy rate) and the amount of historical information it stores (statistical complexity) [3,14]. Such properties cannot be calculated from an HMM representation that is not an ε-machine.…”
Section: ε-Machines
confidence: 99%
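As a concrete illustration of those two quantities, here is a small sketch for a unifilar two-state machine, the standard presentation of the Even Process discussed in the next excerpt. The machine, its transition probabilities, and the helper names are assumptions for illustration, not taken from the citing paper; the formulas for h_mu and C_mu are the standard ones for unifilar presentations.

    from math import log2

    # Hypothetical two-state unifilar machine (Even Process presentation):
    # state A emits 0 (stay in A) or 1 (go to B) with probability 1/2 each;
    # state B emits 1 with probability 1 and returns to A.
    # trans[state] = list of (symbol, probability, next_state)
    trans = {
        "A": [("0", 0.5, "A"), ("1", 0.5, "B")],
        "B": [("1", 1.0, "A")],
    }

    # Stationary state distribution (solved by hand for this small machine).
    pi = {"A": 2 / 3, "B": 1 / 3}

    # Entropy rate: average uncertainty of the next symbol given the state,
    # h_mu = -sum_s pi(s) sum_x p(x|s) log2 p(x|s), valid for unifilar HMMs.
    h_mu = -sum(pi[s] * sum(p * log2(p) for _, p, _ in edges if p > 0)
                for s, edges in trans.items())

    # Statistical complexity: entropy of the stationary state distribution.
    C_mu = -sum(p * log2(p) for p in pi.values())

    print(f"h_mu = {h_mu:.4f} bits/symbol")  # ~0.6667
    print(f"C_mu = {C_mu:.4f} bits")         # ~0.9183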
“…For example, the strings 0110 and 011110 are in its language, but the string 010 is not. It turns out that this process is not equivalent to a Markov Chain of any finite order [14]. In such cases, one must employ a more sophisticated model class such as Hidden Markov Models (HMMs).…”
Section: Markov Models
confidence: 99%
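The constraint the quote describes is that every completed run of 1s must have even length, which is why no fixed, finite look-back suffices: the parity of the current run can depend on arbitrarily distant symbols. A small hypothetical checker (not from the citing paper) for whether a finite word can occur in this process:

    import re

    def in_even_process_language(word: str) -> bool:
        """A word is allowed if every maximal run of 1s bounded by 0s on both
        sides has even length; runs touching either end of the word are
        unconstrained, since unseen symbols could complete them."""
        return all(len(run) % 2 == 0
                   for run in re.findall(r"(?<=0)(1+)(?=0)", word))

    for w in ["0110", "011110", "010"]:
        print(w, in_even_process_language(w))  # True, True, False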