We discuss algorithms for estimating the Shannon entropy $h$ of finite symbol sequences with long-range correlations. In particular, we consider algorithms which estimate $h$ from the code lengths produced by some compression algorithm. Our interest is in describing their convergence with sequence length, assuming no limits for the space and time complexities of the compression algorithms. A scaling law is proposed for extrapolation from finite sample lengths. This is applied to sequences of dynamical systems in non-trivial chaotic regimes, a 1-D cellular automaton, and to written English texts.

Partially random chains of symbols $s_1, s_2, s_3, \ldots$ drawn from some finite alphabet (we restrict ourselves here to finite alphabets, though most of our considerations would also apply to countable ones) appear in practically all sciences. Examples include spins in one-dimensional magnets, written texts, DNA sequences, geological records of the orientation of the magnetic field of the earth, and bits in the storage and transmission of digital data. An interesting question in all these contexts is to what degree these sequences can be "compressed" without losing any information. This question was first posed by Shannon [1] in a probabilistic context. He showed that the relevant quantity is the entropy (or average information content) $h$, which in the case of magnets coincides with the thermodynamic entropy of the spin degrees of freedom. Estimating the entropy is non-trivial in the presence of complex and long-range correlations. In that case one essentially has to understand these correlations perfectly for optimal compression and entropy estimation, and thus estimates of $h$ also measure the degree to which the structure of the sequence is understood.
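The core idea lends itself to a short illustration. Below is a minimal Python sketch, not the paper's specific estimators, of a compression-based entropy estimate: a lossless compressor's output length per symbol serves as an (asymptotic) upper bound on $h$ in bits per symbol, and the finite-length bias visible in the output is what motivates extrapolation in the sequence length $N$. The choice of zlib as the compressor is an assumption made purely for illustration.

```python
import zlib
import numpy as np

def entropy_rate_bits(seq: bytes) -> float:
    """Crude entropy estimate: compressed size in bits per input symbol.

    A general-purpose lossless compressor gives an (asymptotic) upper
    bound on the true entropy rate h; the estimate converges slowly as
    the sequence grows, which motivates extrapolation in the length N.
    """
    compressed = zlib.compress(seq, level=9)
    return 8.0 * len(compressed) / len(seq)

# Compare a fair random binary sequence (h = 1 bit/symbol) with a
# constant one (h = 0): the estimates bracket the truth but carry a
# clear finite-length bias, especially for the low-entropy sequence.
rng = np.random.default_rng(0)
for n in (1_000, 10_000, 100_000):
    random_seq = rng.integers(0, 2, size=n).astype(np.uint8).tobytes()
    constant_seq = bytes(n)  # n zero bytes
    print(n, entropy_rate_bits(random_seq), entropy_rate_bits(constant_seq))
```

For the fair-coin sequence the estimate approaches 1 bit/symbol from above, while the constant sequence shows how compressor overhead inflates short-sample estimates, the finite-length effect that an extrapolation scheme has to remove.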
We consider the problem of finite sample corrections for entropy estimation. New estimates of the Shannon entropy are proposed and their systematic error (the bias) is computed analytically. We find that our results cover correction formulas of current entropy estimates recently discussed in the literature. The trade-off between bias reduction and the increase of the corresponding statistical error is analyzed.

PACS: 89.70.+c, 02.50.Fz, 05.45.Tp

Statistical fluctuations of small samples induce both statistical and systematic deviations of entropy estimates. In the naive ("likelihood") estimator one replaces the discrete probabilities $p_i$, $i = 1, \ldots, M$, in the Shannon entropy [1],
$$H = -\sum_{i=1}^{M} p_i \ln p_i,$$
by the observed relative frequencies $\hat{p}_i = n_i/N$, where $n_i$ is the number of occurrences of symbol $i$ in a sample of size $N$.
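As a point of reference for the bias discussion, here is a minimal Python sketch, not taken from the paper, of the naive estimator together with the classical first-order correction $(M^\ast - 1)/(2N)$ (Miller-Madow), where $M^\ast$ is the number of occupied bins; this is one correction formula of the kind the abstract refers to.

```python
import numpy as np

def entropy_naive(counts: np.ndarray) -> float:
    """Plug-in ("likelihood") estimate: relative frequencies in place of p_i."""
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

def entropy_miller_madow(counts: np.ndarray) -> float:
    """Naive estimate plus the classical first-order bias correction
    (M* - 1) / (2N), where M* is the number of occupied bins."""
    N = counts.sum()
    m_occupied = np.count_nonzero(counts)
    return entropy_naive(counts) + (m_occupied - 1) / (2 * N)

# The naive estimator systematically underestimates H for small N;
# the correction removes the leading O(1/N) term of the bias.
rng = np.random.default_rng(1)
p_true = np.ones(10) / 10  # uniform distribution, H = ln 10
counts = np.bincount(rng.choice(10, size=100, p=p_true), minlength=10)
print(np.log(10), entropy_naive(counts), entropy_miller_madow(counts))
```

Adding the correction term back removes the leading $O(1/N)$ part of the systematic error at the price of a somewhat larger statistical error, which is the trade-off the abstract refers to.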
We consider the Heisenberg uncertainty principle of position and momentum in 3-dimensional spaces of constant curvature $K$. The uncertainty of position is defined in a coordinate-independent way by the geodesic radius of spherical domains in which the particle is localized after a von Neumann-Lüders projection. By applying standard mathematical results from spectral analysis on manifolds, we obtain the largest lower bound of the momentum deviation in terms of the geodesic radius and $K$. For hyperbolic spaces, we also obtain a global lower bound $\sigma_p \geq |K|^{1/2}$, which is non-zero and independent of the uncertainty in position. Finally, the lower bound for the Schwarzschild radius of a static black hole is derived and given by $r_s \geq 2\,l_P$, where $l_P$ is the Planck length.
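To make the role of spectral analysis concrete, the following is a hedged sketch of the flat-space ($K = 0$) analogue of the argument, using only the well-known first Dirichlet eigenvalue of the Euclidean ball; the constant-curvature results replace this eigenvalue by its counterpart on the curved geodesic ball. The notation $B_r$ and $\lambda_1$ is introduced here for illustration and is not taken from the paper.

```latex
% Flat-space (K = 0) reference case: particle localized in a ball B_r
% of geodesic radius r after the projection, i.e. a normalized
% \psi \in H^1_0(B_r).
%
% The first Dirichlet eigenvalue of the Laplacian on B_r \subset R^3 is
%   \lambda_1(B_r) = (\pi / r)^2,
% and the Rayleigh quotient gives, for every such \psi,
%   \langle p^2 \rangle = \hbar^2 \int_{B_r} |\nabla\psi|^2 \, dx
%                       \ \ge\ \hbar^2 \lambda_1(B_r).
% Replacing \psi by e^{-i \langle p \rangle \cdot x / \hbar}\,\psi
% (same support) turns \langle p^2 \rangle into \sigma_p^2, hence
\[
  \sigma_p \ \ge\ \hbar\,\sqrt{\lambda_1(B_r)} \ =\ \frac{\pi\hbar}{r}.
\]
% On a space of constant curvature K, \lambda_1 of the geodesic ball
% depends on both r and K, which is what produces the K-dependent
% bounds quoted above.
```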
We consider particles prepared by a single slit diffraction experiment. For those particles the standard deviation $\sigma_p$ of the momentum is discussed. We find that $\sigma_p = \infty$ is not an exception but rather the typical case. A necessary and sufficient condition for $\sigma_p < \infty$ is given. Finally, the inequality $\sigma_p\,\Delta x \geq \pi\hbar$ is derived, where $\Delta x$ is the slit width, and it is shown that this bound cannot be improved.
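The divergence of $\sigma_p$ is easy to see numerically. Below is a minimal Python sketch, assuming (purely for illustration) a uniform wave function across a slit of width $a$ and units $\hbar = a = 1$: the momentum density is then $|\varphi(p)|^2 = (2/\pi)\,\sin^2(p/2)/p^2$, so $p^2\,|\varphi(p)|^2$ does not decay and the truncated second moment grows linearly with the cutoff.

```python
import numpy as np

# Uniform ("rectangular") wave function across a slit of width a,
# in units hbar = a = 1. The momentum-space density is
#   |phi(p)|^2 = (2/pi) * sin^2(p/2) / p^2   (normalized to 1),
# so p^2 |phi(p)|^2 = (2/pi) * sin^2(p/2), which does not decay.
def truncated_second_moment(P: float, n: int = 1_000_000) -> float:
    """Riemann-sum approximation of the integral of p^2 |phi(p)|^2
    over [-P, P]; analytically this equals (2/pi) * (P - sin(P))."""
    p = np.linspace(-P, P, n)
    integrand = (2.0 / np.pi) * np.sin(p / 2.0) ** 2  # the p^2 cancels
    return integrand.sum() * (p[1] - p[0])

# The truncated moment grows linearly with the cutoff P, i.e. the full
# integral diverges: sigma_p is infinite for this slit preparation.
for P in (10.0, 100.0, 1000.0):
    print(f"P = {P:7.1f}   integral ~ {truncated_second_moment(P):.1f}")
```

Any amplitude with a jump at the slit edges behaves this way; a finite $\sigma_p$ requires the wave function to vanish smoothly at the edges, which is the flavor of the necessary and sufficient condition mentioned above.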