Proceedings of the 5th Conference on Innovations in Theoretical Computer Science 2014
DOI: 10.1145/2554797.2554806

Decision trees, protocols and the entropy-influence conjecture

Abstract: Given f : {−1, 1}^n → {−1, 1}, define the spectral distribution of f to be the distribution on subsets of [n] in which the set S is sampled with probability f̂(S)². Then the Fourier Entropy-Influence (FEI) conjecture of Friedgut and Kalai [FK96] states that there is some absolute constant C such that H[f̂²] ≤ C · Inf[f]. Here, H[f̂²] denotes the Shannon entropy of f's spectral distribution, and Inf[f] is the total influence of f. This conjecture is one of the major open problems in the analysis of Boolean functions, and settling it would have …
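To make the two quantities in the conjecture concrete, here is a minimal brute-force sketch, not taken from the paper: it computes the spectral entropy H[f̂²] and the total influence Inf[f] for a small example function. The choice of 3-bit majority and all function and variable names are illustrative assumptions.

```python
# A minimal numerical sketch (not from the paper) of the two quantities in the
# FEI conjecture: the spectral entropy H[f_hat^2] and the total influence Inf[f],
# computed by brute force for 3-bit majority (an illustrative choice).

from itertools import product
from math import log2

n = 3

def maj(x):
    """3-bit majority with +/-1 inputs and a +/-1 output."""
    return 1 if sum(x) > 0 else -1

points = list(product([-1, 1], repeat=n))

def fourier_coefficient(f, S):
    """f_hat(S) = E_x[ f(x) * prod_{i in S} x_i ] under the uniform distribution."""
    total = 0.0
    for x in points:
        chi = 1
        for i in S:
            chi *= x[i]
        total += f(x) * chi
    return total / len(points)

# Enumerate all subsets S of [n]; for Boolean-valued f the squared coefficients
# sum to 1 (Parseval), so they form the spectral distribution from the abstract.
subsets = [tuple(i for i in range(n) if (mask >> i) & 1) for mask in range(2 ** n)]
weights = {S: fourier_coefficient(maj, S) ** 2 for S in subsets}

entropy = -sum(w * log2(w) for w in weights.values() if w > 0)   # H[f_hat^2]
influence = sum(len(S) * w for S, w in weights.items())          # Inf[f] = sum_S |S| * f_hat(S)^2

print(f"spectral entropy H = {entropy:.4f}")
print(f"total influence Inf = {influence:.4f}")
print(f"ratio H / Inf = {entropy / influence:.4f}")  # FEI asks for a universal upper bound on this ratio
```

For 3-bit majority the spectral weight is 1/4 on each of the three singletons and 1/4 on {1, 2, 3}, giving H = 2 and Inf = 1.5, so the ratio H/Inf is 4/3.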

Cited by 9 publications (20 citation statements). References 11 publications (32 reference statements).
“…We note that the reduction in [53,Proposition E.2] shows that removing the requirement Inf(f ) ≥ 1 from the above inequality will prove the FEI conjecture for all Boolean functions with Inf(f ) ≥ log n. Furthermore, if we could show the FEI conjecture for Boolean functions f where aUC ⊕ (f ) = ω(1) is a slow-growing function of n, again the padding argument in [53] shows that we would be able to establish the FEI conjecture for all Boolean functions.…”
“…Later Wan et al [53] used Shannon's source coding theorem [49] (which characterizes entropy) to establish the FEI conjecture for read-k decision trees for constant k. Using their novel interpretation of the FEI conjecture they also reproved O'Donnell-Tan's composition theorem in an elegant way. Recently, Shalev [47] improved the constant in the FEI inequality for read-k decision trees, and further verified the conjecture when either the influence is too low, or the entropy is too high.…”
Section: Prior Work
“…Conjecture 1.1 was verified for various families of Boolean functions (e.g., symmetric functions [10], random functions [3], read-once formulas [1,9], decision trees of constant average depth [11], read-k decision trees for constant k [11]) but is still open for the class of general Boolean functions.…”
Section: Conjecture 1.1 ([4])
“…In particular, the average decision tree complexity of the sequence (F m ) m≥0 is unbounded, and thus the construction is not subject to the upper bound on constant average depth decision trees of [11]. Each F m is still computable by a read-once formula, though.…”
Section: Afterthoughts