The relation between the nonanticipative rate distortion function (RDF) and filtering theory is discussed on abstract spaces. The relation is established by imposing a realizability constraint on the reconstruction conditional distribution of the classical RDF. Existence of the extremum solution of the nonanticipative RDF is shown using weak*-convergence in an appropriate topology. The extremum reconstruction conditional distribution is derived in closed form for the case of stationary processes. The realization of the reconstruction conditional distribution that achieves the infimum of the nonanticipative RDF is described. Finally, an example is presented to illustrate the concepts.
We deal with zero-delay source coding of a vector-valued Gauss-Markov source subject to a mean-squared error (MSE) fidelity criterion, characterized by the operational zero-delay vector-valued Gaussian rate distortion function (RDF). We address this problem by considering the nonanticipative RDF (NRDF), which is a lower bound to the causal optimal performance theoretically attainable (OPTA) function (or simply causal RDF) and to the operational zero-delay RDF. We recall the realization that corresponds to the optimal "test channel" of the Gaussian NRDF when considering a vector Gauss-Markov source subject to an MSE distortion over a finite time horizon. Then, we introduce sufficient conditions for existence of a solution to this problem in the infinite time horizon (or asymptotic regime). For the asymptotic regime, we use the asymptotic characterization of the Gaussian NRDF to provide a new equivalent realization scheme with feedback, characterized by a resource allocation (reverse-waterfilling) problem across the dimensions of the vector source. We leverage the new realization to derive a predictive coding scheme via lattice quantization with subtractive dither and joint memoryless entropy coding. This coding scheme offers an upper bound to the operational zero-delay vector-valued Gaussian RDF. With scalar quantization, for r active dimensions of the vector Gauss-Markov source, the gap between the obtained lower bound and the theoretical upper bound is at most 0.254r + 1 bits/vector. We further show that with vector quantization and infinite-dimensional Gauss-Markov sources, this gap can be made negligible, i.e., the Gaussian NRDF approximates the operational zero-delay Gaussian RDF. We also extend our results to vector-valued Gaussian sources of any finite memory under mild conditions. Our theoretical framework is demonstrated with illustrative numerical experiments.
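The reverse-waterfilling allocation mentioned above can be sketched for the static per-time-step Gaussian MSE case: given per-dimension source variances, a single water level determines how the total distortion budget is split across dimensions. The function name, bisection tolerance, and interface below are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def reverse_waterfilling(variances, D, tol=1e-9):
    """Find the water level theta with sum_i min(theta, var_i) = D,
    then return per-dimension distortions and the resulting rate in bits.
    A sketch for the static Gaussian MSE case (assumes 0 < D <= sum(variances))."""
    variances = np.asarray(variances, dtype=float)
    lo, hi = 0.0, float(variances.max())
    while hi - lo > tol:                       # bisect on the water level
        theta = 0.5 * (lo + hi)
        if np.minimum(theta, variances).sum() < D:
            lo = theta
        else:
            hi = theta
    theta = 0.5 * (lo + hi)
    d = np.minimum(theta, variances)                       # distortion per dimension
    rate = 0.5 * np.log2(variances / d).clip(min=0).sum()  # only active dimensions pay rate
    return d, rate
```

For example, with variances (4, 1) and total distortion budget D = 2, the water level settles at 1: both dimensions receive distortion 1, only the first dimension is "active", and the rate is 0.5·log2(4/1) = 1 bit.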
In this paper, we develop finite-time horizon causal filters using nonanticipative rate distortion theory. We apply the developed theory to design optimal filters for time-varying multidimensional Gauss-Markov processes, subject to a mean square error fidelity constraint. We show that such filters are equivalent to the design of an optimal {encoder, channel, decoder} that ensures the error satisfies the fidelity constraint. Moreover, we derive a universal lower bound on the mean square error of any estimator of time-varying multidimensional Gauss-Markov processes in terms of conditional mutual information. Unlike classical Kalman filters, the developed filter is characterized by a reverse-waterfilling algorithm, which ensures that the fidelity constraint is satisfied. The theoretical results are demonstrated via illustrative examples.
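The lower bound referenced above is stated in terms of conditional mutual information; its classical single-letter analogue, the Shannon lower bound for MSE distortion, illustrates the flavor of such information-theoretic estimation bounds (this specific form is standard rate-distortion theory, not a restatement of the paper's bound):

```latex
R(D) \;\ge\; h(X) - \tfrac{1}{2}\log_2(2\pi e D)
\quad\Longleftrightarrow\quad
D \;\ge\; \frac{1}{2\pi e}\, 2^{\,2h(X) - 2R(D)},
```

where h(X) is the differential entropy of the source: any estimator operating at rate R incurs MSE at least exponential in the entropy-rate gap.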
Directed information or its variants are utilized extensively in the characterization of the capacity of channels with memory and feedback, nonanticipative lossy data compression, and their generalizations to networks. In this paper, we derive several functional and topological properties of directed information for general abstract alphabets (complete separable metric spaces) using the topology of weak convergence of probability measures. These include convexity of the sets of consistent distributions, which uniquely define causally conditioned distributions, convexity and concavity of directed information with respect to the sets of consistent distributions, and weak compactness of these sets of distributions, their joint distributions, and their marginals. Furthermore, we show lower semicontinuity of directed information, and under certain conditions we also establish continuity of directed information. Finally, we derive variational equalities for directed information, including sequential versions. These may be viewed as the analogue of the variational equalities of mutual information (utilized in the Blahut-Arimoto algorithm). In summary, we extend the basic functional and topological properties of mutual information to directed information. These properties are discussed in the context of extremum problems of directed information.
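The variational equalities of mutual information mentioned above underpin the classical Blahut-Arimoto iteration for the rate-distortion function, which alternates between the test channel and the output marginal. A minimal finite-alphabet sketch follows; the function name, Lagrange-slope parameterization, and iteration count are illustrative assumptions, not the paper's (sequential, directed-information) formulation.

```python
import numpy as np

def blahut_arimoto_rd(px, dist, s, iters=200):
    """Classical Blahut-Arimoto iteration for the rate-distortion function
    at Lagrange slope s (in nats), for finite alphabets.
    px: source pmf (n,); dist: distortion matrix d(x, y), shape (n, m).
    Returns (rate in bits, expected distortion) on the R(D) curve."""
    n, m = dist.shape
    qy = np.full(m, 1.0 / m)                      # output marginal, start uniform
    for _ in range(iters):
        w = qy * np.exp(-s * dist)                # unnormalized test channel, (n, m)
        pyx = w / w.sum(axis=1, keepdims=True)    # p(y|x): normalize each row
        qy = px @ pyx                             # re-fit the output marginal
    D = float(np.sum(px[:, None] * pyx * dist))
    safe = np.where(pyx > 0, pyx, 1.0)            # zero entries contribute 0 to the sum
    R = float(np.sum(px[:, None] * pyx * np.log2(safe / qy)))
    return R, D
```

As a sanity check, for a uniform binary source with Hamming distortion and slope s = ln 9, the fixed point is the test channel with crossover 0.1, giving D = 0.1 and R = 1 − h(0.1) ≈ 0.531 bits, matching the known R(D) = 1 − h(D).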