2017
DOI: 10.1109/tit.2017.2735438
Distributed Simulation of Continuous Random Variables

Abstract: We establish the first known upper bound on the exact and Wyner's common information of n continuous random variables in terms of the dual total correlation between them (which is a generalization of mutual information). In particular, we show that when the pdf of the random variables is log-concave, there is a constant gap of n² log e + 9n log n between this upper bound and the dual total correlation lower bound that does not depend on the distribution. The upper bound is obtained using a computationally eff…
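For background, the dual total correlation referenced in the abstract is commonly defined in the information-theory literature (this definition is not quoted from the abstract itself) as

D(X_1, \dots, X_n) = H(X_1, \dots, X_n) - \sum_{i=1}^{n} H\bigl(X_i \mid X_1, \dots, X_{i-1}, X_{i+1}, \dots, X_n\bigr),

which reduces to the mutual information I(X_1; X_2) when n = 2.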

Citations: Cited by 24 publications (40 citation statements). References: 15 publications.
“…Alternatively, we can consider the expected length of common randomness required to generate the target joint distribution exactly. Such a variant of the problem, termed exact common information, was studied in [109] (see also [111] for a protocol that exactly generates target distributions on continuous alphabets). The exact common information is larger than or equal to the Wyner common information by definition.…”
Section: B. Wyner Common Information (mentioning)
Confidence: 99%
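For reference, the relation stated above follows from the standard definitions (given here for context, not quoted from the citing paper): Wyner's common information is

C(X; Y) = \min_{p_{W \mid X,Y}:\, X \to W \to Y} I(X, Y; W),

while the exact common information is the minimum asymptotic rate of common randomness W from which X and Y can be generated exactly with no further communication; since any such W also makes X and Y conditionally independent, the exact common information is at least C(X; Y).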
“…However, in this paper (as well as in [7]), we combine it with distribution truncation techniques to analyze sources with countably infinite alphabets. We also combine the mixture decomposition technique with truncation, discretization, and Li and El Gamal's dyadic decomposition technique [21] to analyze continuous sources. Furthermore, as by-products of our analyses, various lemmas that may be of independent interest are derived, e.g., the "chain rule for coupling" lemma, the (distributed and centralized) Rényi-covering lemmas, the log-concavity invariance lemma, etc.…”
Section: A. Main Contributions (mentioning)
Confidence: 99%
“…where equality in (21) holds if and only if π_{XY} = π_X π_Y. b) Moreover, assume supp(π_{XY}) = X × Y.…”
Section: A. Maximal Cross-Entropy (mentioning)
Confidence: 99%