This paper introduces the notion of exact common information, which is the minimum description length of the common randomness needed for the exact distributed generation of two correlated random variables (X, Y). We introduce the quantity G(X; Y) = min_{X→W→Y} H(W) as a natural bound on the exact common information and study its properties and computation. We then introduce the exact common information rate, which is the minimum description rate of the common randomness for the exact generation of a 2-DMS (X, Y). We give a multiletter characterization for it as the limit Ḡ(X; Y) = lim_{n→∞} (1/n) G(X^n; Y^n). While in general Ḡ(X; Y) is greater than or equal to the Wyner common information, we show that they are equal for the Symmetric Binary Erasure Source. We do not know, however, whether the exact common information rate has a single-letter characterization in general.
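In the notation of the abstract (where X → W → Y means X and Y are conditionally independent given W), the two quantities, and their relation to the Wyner common information (written here as C_W), can be spelled out as

G(X; Y) = \min_{p(w \mid x, y) :\, X \to W \to Y} H(W), \qquad \bar{G}(X; Y) = \lim_{n \to \infty} \tfrac{1}{n}\, G(X^n; Y^n),

with C_W(X; Y) = \min_{X \to W \to Y} I(X, Y; W) \le G(X; Y), since I(X, Y; W) \le H(W) for any common randomness variable W.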
We establish the first known upper bound on the exact and Wyner's common information of n continuous random variables in terms of the dual total correlation between them (a generalization of mutual information). In particular, we show that when the pdf of the random variables is log-concave, there is a gap of n^2 log e + 9n log n between this upper bound and the dual total correlation lower bound that does not depend on the distribution. The upper bound is obtained using a computationally efficient dyadic decomposition scheme for constructing a discrete common randomness variable W from which the n random variables can be simulated in a distributed manner. We then bound the entropy of W using a new measure, which we refer to as the erosion entropy.
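For reference, the dual total correlation of X_1, …, X_n mentioned above is the standard quantity

D(X_1; \ldots; X_n) = H(X_1, \ldots, X_n) - \sum_{i=1}^{n} H\left( X_i \mid X_{[n] \setminus \{i\}} \right),

which reduces to the mutual information I(X_1; X_2) when n = 2. In this notation, the log-concave guarantee above states that the exact (and Wyner) common information of X_1, …, X_n lies between D(X_1; …; X_n) and D(X_1; …; X_n) + n^2 log e + 9n log n.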
This paper shows that the Hirschfeld-Gebelein-Rényi maximal correlation between the message and the ciphertext provides good secrecy guarantees for cryptosystems that use short keys. We first establish a bound on the eavesdropper's advantage in guessing functions of the message in terms of maximal correlation and the Rényi entropy of the message. This result implies that maximal correlation is stronger than the notion of entropic security introduced by Russell and Wang. We then show that a small maximal correlation ρ can be achieved via a randomly generated cipher with key length ≈ 2 log(1/ρ), independent of the message length, and by a stream cipher with key length 2 log(1/ρ) + log n + 2 for a message of length n. We establish a converse showing that these ciphers are close to optimal. This is in contrast to entropic security for which there is a gap between the lower and upper bounds. Finally, we show that a small maximal correlation implies secrecy with respect to several mutual information based criteria but is not necessarily implied by them. Hence, maximal correlation is a stronger and more practically relevant measure of secrecy than mutual information.
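For reference, the Hirschfeld-Gebelein-Rényi maximal correlation between the message M and the ciphertext C is the standard quantity

\rho_m(M; C) = \sup_{f, g} \mathbb{E}\left[ f(M)\, g(C) \right],

where the supremum is over all real-valued functions f, g with \mathbb{E}[f(M)] = \mathbb{E}[g(C)] = 0 and \mathbb{E}[f(M)^2] = \mathbb{E}[g(C)^2] = 1. The secrecy guarantee discussed above is that \rho_m(M; C) \le \rho for the prescribed small \rho.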
Existing fixed-length feedback communication schemes are either specialized to particular channels (Schalkwijk-Kailath, Horstein) or apply to general channels but either have high coding complexity (block feedback schemes) or are difficult to analyze (posterior matching). This paper introduces a new fixed-length feedback coding scheme that achieves the capacity of every discrete memoryless channel, has an error exponent that approaches the sphere-packing bound as the rate approaches capacity, and has O(n log n) coding complexity. These benefits are achieved by judiciously combining features from previous schemes with a new randomization technique and a new encoding/decoding rule. These new features make the error probability of the new scheme easier to analyze than that of posterior matching.
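For context, the sphere-packing bound referred to above is the classical upper limit on the error exponent of any fixed-length block code over a DMC with transition probabilities P(y|x); in Gallager's form (stated here only as background, not as part of the new scheme),

E_{sp}(R) = \sup_{\rho \ge 0} \left[ \max_{Q} E_0(\rho, Q) - \rho R \right], \qquad E_0(\rho, Q) = -\log \sum_{y} \left( \sum_{x} Q(x)\, P(y \mid x)^{1/(1+\rho)} \right)^{1+\rho}.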