We show that Skilling's method of induction leads to a unique general theory of inductive inference, the method of Maximum relative Entropy (ME). The main tool for updating probabilities is the logarithmic relative entropy; other entropies, such as those of Rényi or Tsallis, are ruled out. We also show that Bayes updating is a special case of ME updating and thus that the two are completely compatible.
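For readers who want the functional in question explicitly: the logarithmic relative entropy singled out by this argument is, in standard notation (ours, not necessarily the paper's),

\[
S[p\,|\,q] = -\int dx\; p(x)\,\log\frac{p(x)}{q(x)},
\]

maximized over the posterior p subject to the constraints, with q the prior. The Rényi and Tsallis entropies are one-parameter generalizations of this functional, and it is precisely those generalizations that the consistency requirements eliminate.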
We use the method of Maximum (relative) Entropy to process information in the form of observed data and moment constraints. The generic "canonical" form of the posterior distribution for the problem of simultaneous updating with data and moments is obtained. We discuss the general problem of non-commuting constraints: when they should be processed sequentially and when simultaneously. As an illustration, the multinomial example of die tosses is solved in detail for two superficially similar but actually very different problems.
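Schematically, and in our own notation rather than the paper's: given a joint prior q(x, θ), observed data x′, and a moment constraint ⟨f(θ)⟩ = F, simultaneous ME updating produces a posterior of the canonical form

\[
p(\theta) \propto q(\theta\,|\,x')\, e^{\beta f(\theta)},
\]

that is, the Bayes posterior multiplied by an exponential (canonical) factor, with the Lagrange multiplier β fixed by the moment constraint. When the constraints do not commute, sequential and simultaneous processing generally yield different posteriors, which is the distinction the die-toss examples are designed to exhibit.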
We study the information geometry and the entropic dynamics of a 3D Gaussian statistical model. We then compare our analysis to that of a 2D Gaussian statistical model obtained from the higher-dimensional model via the introduction of an additional information constraint that resembles the quantum mechanical canonical minimum uncertainty relation. We show that the chaoticity (temporal complexity) of the 2D Gaussian statistical model, quantified by means of the Information Geometric Entropy (IGE) and the Jacobi vector field intensity, is softened with respect to the chaoticity of the 3D Gaussian statistical model.
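For reference, the Jacobi vector field intensity invoked here derives from the standard geodesic deviation (Jacobi) equation on the statistical manifold; in conventional notation (ours, not necessarily the paper's),

\[
\frac{D^2 J^\mu}{D\tau^2} + R^\mu{}_{\nu\rho\sigma}\,\frac{d\theta^\nu}{d\tau}\, J^\rho\, \frac{d\theta^\sigma}{d\tau} = 0,
\]

where R is the Riemann curvature tensor of the Fisher-Rao metric and the growth of |J| measures how rapidly neighboring geodesics, i.e., neighboring macroscopic trajectories, diverge.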
In 1960, Rudolf E. Kalman introduced what is now known as the Kalman filter, a method for estimating unknown variables from noisy measurements. The algorithm follows the logic that if the previous state of the system is known, it can be used as the best guess for the current state. This prior information is first propagated through the underlying dynamics of the system, before any measurement is taken. Second, measurements of the unknown variables are taken. These two pieces of information are then combined to determine the current state of the system. Bayesian inference is specifically designed to accommodate the problem of updating what we think about the world on the basis of partial or uncertain information. In this paper, we present a derivation of the general Bayesian filter and then adapt it to Markov systems. A simple example is given for pedagogical purposes. We also show that by using the Kalman assumptions, or "constraints", we can arrive at the Kalman filter using the method of maximum (relative) entropy (MrE), which goes beyond Bayesian methods. Finally, we derive a generalized, nonlinear filter using MrE, of which the original Kalman filter is a special case. We further show that the relationship between the variables can be any function, and thus that approximations such as the extended Kalman filter, the unscented Kalman filter, and other Kalman variants are special cases as well.
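Since the paper recovers the Kalman filter as a special case, a minimal sketch of the standard linear predict/update cycle may help fix ideas. This is our own illustration, not the authors' code: the function name kalman_step and all variable names are assumptions, and the sketch presumes the usual linear-Gaussian model x_{k+1} = F x_k + w, z_k = H x_k + v.

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of a linear Kalman filter.

    x : prior state estimate, shape (n,)
    P : prior state covariance, shape (n, n)
    z : new measurement, shape (m,)
    F : state-transition model, shape (n, n)
    H : observation model, shape (m, n)
    Q : process-noise covariance, shape (n, n)
    R : measurement-noise covariance, shape (m, m)
    """
    # Predict: propagate the previous estimate through the dynamics.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q

    # Update: blend the prediction with the new measurement.
    y = z - H @ x_pred                   # innovation
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Example: tracking a scalar random walk from noisy observations.
x, P = np.array([0.0]), np.eye(1)
F = H = np.eye(1)
Q, R = np.eye(1) * 1e-3, np.eye(1) * 0.1
for z in [0.9, 1.1, 1.0]:
    x, P = kalman_step(x, P, np.array([z]), F, H, Q, R)
```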
In a previous paper, we compared an uncorrelated 3D Gaussian statistical model to an uncorrelated 2D Gaussian statistical model obtained from the former by introducing a constraint that resembles the quantum mechanical canonical minimum uncertainty relation. The analysis was carried out by way of the information geometry and the entropic dynamics of each system. It revealed that the chaoticity of the 2D Gaussian statistical model, quantified by means of the Information Geometric Entropy (IGE), is softened, or weakened, with respect to the chaoticity of the 3D Gaussian statistical model, owing to the accessibility of more information. In this companion work, we further constrain the system by introducing a correlation constraint among the system's micro-variables and show that the chaoticity is further weakened, but only locally. Finally, the physicality of the constraints is briefly discussed, particularly in the context of quantum entanglement.
Information geometry and inductive inference methods can be used to model dynamical systems in terms of their probabilistic description on curved statistical manifolds. In this article, we present a formal conceptual reexamination of the information geometric construction of entropic indicators of complexity for statistical models. Specifically, we present conceptual advances in the interpretation of the information geometric entropy (IGE), a statistical indicator of temporal complexity (chaoticity) defined on curved statistical manifolds underlying the probabilistic dynamics of physical systems.
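As a reminder of the object under reexamination, the IGE is, schematically (our paraphrase of the standard construction, not a quotation from the paper), the logarithm of a time-averaged statistical volume explored by geodesic flow on the manifold:

\[
S_{\mathcal{M}_s}(\tau) = \log \widetilde{\mathrm{vol}}[D_\theta(\tau)], \qquad
\widetilde{\mathrm{vol}}[D_\theta(\tau)] = \frac{1}{\tau}\int_0^\tau d\tau' \int_{D_\theta(\tau')} \sqrt{\det g(\theta)}\; d\theta,
\]

with g the Fisher-Rao metric; asymptotic linear growth of the IGE in τ is taken as the signature of chaoticity.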
Entropic Dynamics (ED) [1] is a theoretical framework developed to investigate the possibility that laws of physics reflect laws of inference rather than laws of nature. In this work, a RED (Reversible Entropic Dynamics) model is considered. The geometric structure underlying the curved statistical manifold M_s is studied. The trajectories of this particular model are hyperbolic curves (geodesics) on M_s. Finally, some analysis concerning the stability of these geodesics on M_s is carried out.
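For concreteness, the geodesics referred to are solutions of the standard geodesic equation for the Fisher-Rao metric on M_s (written here in our notation),

\[
\frac{d^2\theta^k}{d\tau^2} + \Gamma^k_{ij}\,\frac{d\theta^i}{d\tau}\frac{d\theta^j}{d\tau} = 0, \qquad
g_{ij}(\theta) = \int dx\; p(x|\theta)\,\partial_i \log p(x|\theta)\,\partial_j \log p(x|\theta),
\]

with Γ the Christoffel symbols of g; the stability analysis then asks how such geodesics respond to perturbations of their initial conditions.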