2015
DOI: 10.48550/arxiv.1506.03167
Preprint

Remarks on the Most Informative Function Conjecture at fixed mean

Cited by 7 publications (10 citation statements)
References 0 publications
“…A weaker version of this result, with ρ₀ replaced by a sequence that vanishes as n → ∞, was proven in [15], [17]. In the Gaussian setting, the analogous Courtade-Kumar conjecture and the analogous Li-Médard conjecture were confirmed by Kindler, O'Donnell, and Witmer [18] and by Eldan [19], respectively; these results were further generalized to Φ-stability in [18].…”
Section: Introduction
confidence: 82%
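For context, the conjecture these statements refer to admits a compact standard formulation (our paraphrase, not text from the citing papers). With X uniform on {0,1}^n and Y the output of a memoryless BSC(α), 0 < α < 1/2, with input X, the Courtade-Kumar conjecture asserts that for every Boolean f: {0,1}^n → {0,1},

\[
I\bigl(f(X);Y\bigr) \le 1 - h(\alpha),
\qquad
h(\alpha) = -\alpha \log_2 \alpha - (1-\alpha)\log_2(1-\alpha),
\]

with equality attained by the dictator functions f(x) = x_i.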
“…Even in this simple case, determining the most informative function seems to be non-trivial. Further investigation of this problem was carried out in [46]–[49]. In particular, [46] studies a related problem in a continuous setting, considering X and Y to be Gaussian random vectors. Recently, Samorodnitsky [50] presented a proof of the conjecture in the high-noise regime.…”
Section: Related Work
confidence: 99%
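To make the setup concrete, the following minimal brute-force sketch computes I(f(X); Y) for the dictator and majority functions over a BSC. It is our illustration, not code from any cited work; the choices n = 5 and α = 0.1 are arbitrary.

from itertools import product
from math import log2

def mutual_information(f, n, alpha):
    """I(f(X); Y) in bits by exhaustive enumeration (feasible for small n)."""
    xs = list(product((0, 1), repeat=n))
    # joint[b][y] = P(f(X) = b, Y = y); X uniform, Y = X xor Bernoulli(alpha) noise
    joint = {0: dict.fromkeys(xs, 0.0), 1: dict.fromkeys(xs, 0.0)}
    for x in xs:
        b = f(x)
        for y in xs:
            d = sum(xi != yi for xi, yi in zip(x, y))  # Hamming distance
            joint[b][y] += (alpha ** d) * ((1 - alpha) ** (n - d)) / 2 ** n
    p_b = {b: sum(joint[b].values()) for b in (0, 1)}
    p_y = 1 / 2 ** n  # Y is uniform: X is uniform and the channel is symmetric
    return sum(p * log2(p / (p_b[b] * p_y))
               for b in (0, 1) for p in joint[b].values() if p > 0)

n, alpha = 5, 0.1
dictator = lambda x: x[0]
majority = lambda x: int(sum(x) > n / 2)
h = -alpha * log2(alpha) - (1 - alpha) * log2(1 - alpha)
print(f"bound 1 - h(alpha) = {1 - h:.4f}")                        # 0.5310
print(f"dictator: {mutual_information(dictator, n, alpha):.4f}")  # meets the bound
print(f"majority: {mutual_information(majority, n, alpha):.4f}")  # strictly below it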
“…In the memoryless deterministic quantizer case, the quantizer is the sign of the received signal. By the result of Kindler et al. [23], the sign of the received signal is the optimal one-bit memoryless deterministic quantizer, though not necessarily the optimal entropy-constrained deterministic quantizer. The curve in Fig.…”
Section: B. Demonstrating the Advantages of the Stochastic Quantizer
confidence: 99%
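As a rough numerical companion to this statement, the sketch below compares the sign quantizer with a biased threshold for BPSK over AWGN. The setting (σ = 1, thresholds 0 and 0.5) is our assumption for illustration; it probes only threshold quantizers, not arbitrary deterministic maps, and none of it comes from the citing paper.

from math import erf, log2, sqrt

def one_bit_mi(t, sigma):
    """I(X; 1{Y > t}) in bits for Y = X + N, X uniform on {-1, +1}, N ~ N(0, sigma^2)."""
    gauss_tail = lambda x: 0.5 * (1 - erf(x / sqrt(2)))  # P(standard normal > x)
    p_plus = gauss_tail((t - 1) / sigma)   # P(Y > t | X = +1)
    p_minus = gauss_tail((t + 1) / sigma)  # P(Y > t | X = -1)
    h = lambda p: 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)
    p_out = 0.5 * (p_plus + p_minus)       # P(quantizer output = 1)
    return h(p_out) - 0.5 * (h(p_plus) + h(p_minus))  # I = H(B) - H(B | X)

sigma = 1.0
print(one_bit_mi(0.0, sigma))  # sign quantizer: largest among thresholds, ~0.37 bits
print(one_bit_mi(0.5, sigma))  # biased threshold: strictly smaller, ~0.34 bits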