We study how the Shannon entropy of sequences produced by an information source converges to the source's entropy rate. We synthesize several phenomenological approaches to applying information-theoretic measures of randomness and memory to stochastic and deterministic processes by using successive derivatives of the Shannon entropy growth curve. This leads, in turn, to natural measures of the apparent memory stored in a source and of the amounts of information that must be extracted from observations of a source in order for it to be optimally predicted and for an observer to synchronize to it. One consequence of ignoring these structural properties is that the missed regularities are converted to apparent randomness. We demonstrate that this problem arises particularly for small data sets, e.g., in settings where one has access only to short measurement sequences.
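To make the entropy-convergence idea concrete, here is a minimal Python sketch, not taken from the paper: it estimates the block entropy H(L) from a symbol sequence, takes the discrete derivative h(L) = H(L) - H(L-1) as the length-L entropy-rate estimate, and sums the excess of h(L) over its limit to approximate the excess entropy E. The period-2 example, the cutoff L_max, and all names are illustrative assumptions.

```python
import math
from collections import Counter

def block_entropy(seq, L):
    """Shannon entropy (bits) of the empirical distribution of length-L blocks."""
    counts = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    n = sum(counts.values())
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def entropy_convergence(seq, L_max):
    """Return H(L) and its discrete derivative h(L) = H(L) - H(L-1), L = 1..L_max."""
    H = [0.0] + [block_entropy(seq, L) for L in range(1, L_max + 1)]
    h = [H[L] - H[L - 1] for L in range(1, L_max + 1)]
    return H[1:], h

# Illustrative process: period-2 sequence -- all structure, no randomness.
seq = [0, 1] * 5000
H, h = entropy_convergence(seq, 6)
h_mu = h[-1]                      # crude estimate of the entropy rate
E = sum(hL - h_mu for hL in h)    # excess entropy: area between h(L) and h_mu
print(f"h_mu ~ {h_mu:.3f} bits/symbol, E ~ {E:.3f} bits")
```

For this periodic source the sketch returns h_mu near 0 and E near 1 bit: the single bit of phase information an observer must extract to synchronize. On short sequences the h(L) estimates are biased downward at large L, which is one face of the small-data problem the abstract describes.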
Intrinsic computation refers to how dynamical systems store, structure, and transform historical and spatial information. By graphing a measure of structural complexity against a measure of randomness, complexity-entropy diagrams display the range and different kinds of intrinsic computation across an entire class of systems. Here, we use complexity-entropy diagrams to analyze intrinsic computation in a broad array of deterministic nonlinear and linear stochastic processes, including maps of the interval, cellular automata and Ising spin systems in one and two dimensions, Markov chains, and probabilistic minimal finite-state machines. Since complexity-entropy diagrams are a function only of observed configurations, they can be used to compare systems without reference to system coordinates or parameters. It has been known for some time that, in special cases, complexity-entropy diagrams reveal that high degrees of information processing are associated with phase transitions in the underlying process space, the so-called "edge of chaos". Generally, though, complexity-entropy diagrams differ substantially in character, demonstrating a genuine diversity of distinct kinds of intrinsic computation.

Discovering organization in the natural world is one of science's central goals. Recent innovations in nonlinear mathematics and physics, in concert with analyses of how dynamical systems store and process information, have produced a growing body of results on quantitative ways to measure natural organization. These efforts had their origin in earlier investigations of the origins of randomness. Eventually, however, it was realized that measures of randomness do not capture the property of organization. This led to recent efforts to develop measures that are, on the one hand, as generally applicable as the randomness measures but that, on the other, capture a system's complexity: its organization, structure, memory, regularity, symmetry, and pattern. Here, analyzing processes from dynamical systems, statistical mechanics, stochastic processes, and automata theory, we show that measures of structural complexity are a necessary and useful complement to describing natural systems only in terms of their randomness. The result is a broad appreciation of the kinds of information processing embedded in nonlinear systems.
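As a concrete illustration of how such a diagram is assembled, the following hedged Python sketch computes one (randomness, structure) point per parameter value of the logistic map, using the block-entropy estimators above: the entropy rate h_mu as the randomness axis and the excess entropy E as the structural-complexity axis. The map, the binary partition at x = 1/2, the parameter values, and the sequence lengths are our illustrative choices, not a prescription from the paper.

```python
import math
from collections import Counter

def block_entropy(seq, L):
    counts = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    n = sum(counts.values())
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def h_and_E(seq, L_max=8):
    """Entropy-rate and excess-entropy estimates from block entropies."""
    H = [0.0] + [block_entropy(seq, L) for L in range(1, L_max + 1)]
    h = [H[L] - H[L - 1] for L in range(1, L_max + 1)]
    h_mu = h[-1]
    return h_mu, sum(hL - h_mu for hL in h)

def logistic_binary(r, n=20000, x=0.3):
    """Binary coarse-graining (x < 1/2 -> 0, else 1) of a logistic-map orbit."""
    for _ in range(1000):            # discard the transient
        x = r * x * (1.0 - x)
    out = []
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(0 if x < 0.5 else 1)
    return out

# One (randomness, structure) point per parameter value r.
for r in [3.5, 3.58, 3.7, 3.9, 4.0]:
    h_mu, E = h_and_E(logistic_binary(r))
    print(f"r = {r:.2f}: h_mu ~ {h_mu:.3f} bits/symbol, E ~ {E:.3f} bits")
```

Scattering these (h_mu, E) pairs for many parameter values traces out the complexity-entropy diagram for this family: periodic windows sit at h_mu near 0 with E equal to the log of the period, fully developed chaos at r = 4 sits at h_mu near 1 with low E, and points near the onset of chaos combine low randomness with comparatively high structural complexity.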