Recently, two extensions of Wyner's common information, namely the exact and Rényi common informations, were introduced by Kumar, Li, and El Gamal (KLE) and by the present authors, respectively. The class of common information problems concerns determining the minimum rate of the common input to two independent processors needed to generate a joint distribution exactly or approximately. For the exact common information problem, exact generation of the target distribution is required, whereas for Wyner's and the α-Rényi common informations, the relative entropy and the Rényi divergence of order α, respectively, are used to quantify the discrepancy between the synthesized and target distributions. The exact common information is at least as large as Wyner's common information; however, it was hitherto unknown whether the former is strictly larger than the latter. In this paper, we first establish the equivalence between the exact and ∞-Rényi common informations, and then provide single-letter upper and lower bounds on these two quantities. For doubly symmetric binary sources, we show that the upper and lower bounds coincide, which completely characterizes the exact and ∞-Rényi common informations for such sources. Interestingly, we observe that for such sources these two common informations are strictly larger than Wyner's. This answers an open problem posed by KLE. Furthermore, we extend Wyner's, the ∞-Rényi, and the exact common informations to sources with countably infinite or continuous alphabets, including Gaussian sources.
Index Terms: Wyner's common information, Rényi common information, exact common information, exact channel simulation, exact source simulation, communication complexity of correlation

Recently, the present authors [2], [3] introduced the notion of Rényi common information, which is defined as the minimum common rate when the KL divergence is replaced by a more general family of divergences, the Rényi divergences. When s = 0, the Rényi common information reduces to Wyner's common information. We proved that for Rényi divergences of order 1 + s ∈ (0, 1], the minimum rate needed to guarantee that the (normalized and unnormalized) Rényi divergences vanish asymptotically equals Wyner's common information. However, for Rényi divergences of order 1 + s ∈ (1, 2], we only provided an upper bound, which is larger than Wyner's common information in general. Furthermore, the common information with approximation error measured by the total variation (TV) distance also equals Wyner's common information [2], [4], [5], and exponential achievability and converse results were established in [2], [4], [6].

Kumar, Li, and El Gamal (KLE) [5] extended Wyner's common information in a different direction. They assumed variable-length codes and exact generation of the correlated sources (X, Y) ∼ π_XY, instead of the block codes and approximate simulation of π_XY assumed by Wyner [1] and by us [2], [3]. For this exact generation problem, KLE [5] characterized the minimum common rate, coined the exact common information, by T Exac...
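To make the parametrization above concrete, recall the standard definition of the Rényi divergence of order 1 + s between distributions P and Q on a finite alphabet (the notation here is a sketch; the paper's own definitions govern):

```latex
% Rényi divergence of order \alpha = 1+s (standard definition)
D_{1+s}(P \| Q) \;=\; \frac{1}{s} \log \sum_{x} P(x)^{1+s}\, Q(x)^{-s},
\qquad s \neq 0 .
```

As s → 0, this quantity recovers the KL divergence D(P‖Q) = Σ_x P(x) log(P(x)/Q(x)), which is why the case s = 0 of the Rényi common information coincides with Wyner's formulation, while s > 0 (orders in (1, 2]) yields a stricter approximation criterion.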