2013 IEEE International Symposium on Information Theory
DOI: 10.1109/isit.2013.6620221

Which Boolean functions are most informative?

Abstract: We introduce a simply stated conjecture regarding the maximum mutual information a Boolean function can reveal about noisy inputs. Specifically, let X^n be i.i.d. Bernoulli(1/2), and let Y^n be the result of passing X^n through a memoryless binary symmetric channel with crossover probability α. For any Boolean function b : {0, 1}^n → {0, 1}, we conjecture that I(b(X^n); Y^n) ≤ 1 − H(α). While the conjecture remains open, we provide substantial evidence supporting its validity.
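The conjectured bound 1 − H(α) is exactly the mutual information achieved by a "dictator" function b(x) = x_1, since then I(b(X^n); Y^n) = I(X_1; Y_1) = 1 − H(α) for a BSC(α). As an illustration (not part of the paper), the sketch below computes I(b(X^n); Y^n) exactly by brute-force enumeration for small n and compares a dictator against majority; the helper names `mutual_info`, `dictator`, and `majority` are ours.

```python
import itertools
import math

def h2(p):
    """Binary entropy H(p) in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_info(b, n, alpha):
    """Exact I(b(X^n); Y^n) in bits, for X^n i.i.d. Bernoulli(1/2)
    passed through a memoryless BSC with crossover probability alpha."""
    seqs = list(itertools.product([0, 1], repeat=n))
    joint = {}  # joint law P(W = w, Y^n = y), with W = b(X^n)
    for x in seqs:
        w, px = b(x), 2.0 ** -n
        for y in seqs:
            d = sum(xi != yi for xi, yi in zip(x, y))  # Hamming distance
            p = px * alpha ** d * (1 - alpha) ** (n - d)
            joint[(w, y)] = joint.get((w, y), 0.0) + p
    pw = {w: sum(v for (ww, _), v in joint.items() if ww == w) for w in (0, 1)}
    py = {y: joint.get((0, y), 0.0) + joint.get((1, y), 0.0) for y in seqs}
    return sum(v * math.log2(v / (pw[w] * py[y]))
               for (w, y), v in joint.items() if v > 0)

alpha, n = 0.1, 3
dictator = lambda x: x[0]
majority = lambda x: int(sum(x) > n / 2)
print(mutual_info(dictator, n, alpha))  # matches 1 - h2(alpha)
print(mutual_info(majority, n, alpha))  # strictly smaller, consistent with the conjecture
```

The dictator attains the conjectured maximum with equality; exhaustive checks of this kind over all 2^(2^n) Boolean functions are feasible only for very small n, which is one reason the conjecture is hard to settle.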

Cited by 18 publications (21 citation statements) | References 7 publications
“…We note that a bound similar to (16) below has appeared in the context of bounding the minmax decision risk in [25, (3.4)]. However, the proof technique used in [25] does not lead to the general bound presented in Theorem 3.…”
Section: Remark 2 If
confidence: 97%
“…More recently, Kumar and Courtade [16] investigated Boolean functions in an information-theoretic context. In particular, they analyzed which is the most informative (in terms of mutual information) 1-bit function (i.e.…
Section: Introduction
confidence: 99%
“…One can also show that the condition holds for source distributions corresponding to the input-output pair resulting from a uniformly distributed input into a binary input symmetric output channel. The above ideas suggest that for a recent conjecture regarding Boolean functions [35], hypercontractivity is going to be a more useful tool than maximal correlation. Indeed, evidence for this can be found in [32], where usage of s * helps in an automated proof of an inequality that cannot be proved using maximal correlation.…”
Section: Remark
confidence: 99%
“…Suppose we want to provide constraints on the space of possible joint distributions that can be created by Boolean functions b, b′ : {0, 1}^n → {0, 1} as (W, Z) = (b(X^n), b′(Y^n)), for some n. This problem arises, for instance, when attacking the following weaker version of a conjecture of Kumar and Courtade [24] that was considered in [25]. Conjecture 1.…”
Section: B Non-interactive Simulation of Joint Distributions Using B
confidence: 99%
“…3.4] which considered the doubly symmetric binary source distribution (X, Y). Since the reverse hypercontractivity region for this joint distribution is explicitly known [see (24)], they showed by suitable choices of C_n, D_n that (29) cannot be improved in terms of the exponents on the right-hand side, thus establishing (29) (with optimization of exponents with knowledge of marginal probabilities on the right) as an isoperimetric inequality.…”
Section: Isoperimetric Inequalities
confidence: 99%