2018
DOI: 10.1214/18-aap1384

Dictator functions maximize mutual information

Abstract: Let (X^n, Y^n) denote n independent, identically distributed copies of two arbitrarily correlated Rademacher random variables (X, Y). We prove that the inequality I(f(X^n); g(Y^n)) ≤ I(X; Y) holds for any two Boolean functions f, g : {−1, 1}^n → {−1, 1} (here I(·;·) denotes mutual information). We further show that equality in general is achieved only by the dictator functions f(x) = ±g(x) = ±x_i, i ∈ {1, 2, . . . , n}.
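The inequality in the abstract can be checked numerically by brute force for small n. The sketch below is a minimal illustration and not code from the paper; the block length n = 2, the crossover probability eps = 0.1, and all variable names are assumptions chosen for the example. It enumerates every pair of Boolean functions f, g on {−1, 1}^n, computes I(f(X^n); g(Y^n)) from the induced joint pmf, and compares the maximum with the single-letter benchmark I(X; Y).

```python
# Minimal brute-force sanity check of I(f(X^n); g(Y^n)) <= I(X; Y).
# Assumptions: n = 2 and eps = 0.1 are illustrative choices, not from the paper.
import itertools
import numpy as np

n = 2        # block length (kept tiny so all Boolean functions can be enumerated)
eps = 0.1    # P(X != Y) for one Rademacher pair; correlation rho = 1 - 2*eps

points = list(itertools.product([-1, 1], repeat=n))

# Joint pmf P[x, y] of the i.i.d. vectors (X^n, Y^n): each coordinate pair
# agrees with probability 1 - eps and disagrees with probability eps.
P = np.zeros((len(points), len(points)))
for ix, x in enumerate(points):
    for iy, y in enumerate(points):
        p = 1.0
        for xi, yi in zip(x, y):
            p *= 0.5 * ((1 - eps) if xi == yi else eps)
        P[ix, iy] = p

def mutual_info(joint):
    """Mutual information (in bits) of a joint pmf given as a 2D array."""
    pu = joint.sum(axis=1, keepdims=True)
    pv = joint.sum(axis=0, keepdims=True)
    mask = joint > 0
    return float(np.sum(joint[mask] * np.log2(joint[mask] / (pu @ pv)[mask])))

# Single-letter benchmark I(X; Y) for one correlated Rademacher pair.
I_XY = mutual_info(np.array([[0.5 * (1 - eps), 0.5 * eps],
                             [0.5 * eps, 0.5 * (1 - eps)]]))

best = 0.0
for f in itertools.product([0, 1], repeat=len(points)):      # truth table of f
    F = np.eye(2)[list(f)]                                     # indicator matrix, shape (2^n, 2)
    for g in itertools.product([0, 1], repeat=len(points)):   # truth table of g
        G = np.eye(2)[list(g)]
        best = max(best, mutual_info(F.T @ P @ G))            # joint pmf of (f(X^n), g(Y^n))

print(f"I(X;Y) = {I_XY:.6f}   max over Boolean f, g of I(f;g) = {best:.6f}")
```

Running this should report the maximum equal to I(X; Y) (about 0.531 bits for eps = 0.1), attained by dictator pairs f(x) = g(x) = x_i, consistent with the equality condition stated in the abstract.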

Cited by 12 publications (8 citation statements) · References 7 publications
“…Furthermore, it is sometimes the case that the optimal biclustering problem is easier to solve than the standard single-sided clustering problem. For example, the Courtade-Kumar conjecture [13] for the standard single-sided clustering setting was ultimately proved for the biclustering setting [14]. A particular case, where (X, Y) are drawn from a DSBS distribution and the mappings f and g are restricted to be Boolean functions, was addressed in [14].…”
Section: Introduction
confidence: 99%
“…For example, the Courtade-Kumar conjecture [13] for the standard single-sided clustering setting was ultimately proved for the biclustering setting [14]. A particular case, where (X, Y) are drawn from a DSBS distribution and the mappings f and g are restricted to be Boolean functions, was addressed in [14]. The bound I(f(X^n); g(Y^n)) ≤ I(X; Y) was established, which is tight if and only if f and g are dictator functions.…”
Section: Introduction
confidence: 99%
“…The analogous question in the Gaussian setting was verified by Kindler, O'Donnell and Witmer [12]. Pichler, Piantanida and Matz [18] proved the variant stating that dictator functions maximize the mutual information I(f(X); g(Y)) among all Boolean functions f and g. The original conjecture is still wide open. Courtade and Kumar [6] have observed that their conjecture holds in the extremal scenarios ε = ε(n) → 0, 1/2.…”
Section: Introduction
confidence: 83%
“…Recall our definition of Z_j in (18). Under the noise in (27), Z_j is a Bernoulli random variable taking the values 1 and −1 with equal probability.…”
Section: Noise: Type II
confidence: 99%
“…It was subsequently generalized to the cases of α > 2, α = 1, and 1 < α < 2 in [4], [8], [10], and corresponding conjectures (mentioned above) were posed. In fact, a weaker version (the two-function version) of the Courtade-Kumar conjecture was solved by Pichler, Piantanida, and Matz [14] using Fourier analysis. Ordentlich, Shayevitz, and Weinstein [15] proved a new bound for the Courtade-Kumar conjecture which improves the well-known bound ρ² and turns out to be asymptotically sharp in the limiting case ρ → 0.…”
Section: Introduction
confidence: 99%