We establish the stability, near a Euclidean ball, of two conjectured inequalities: the dimensional Brunn-Minkowski inequality for radially symmetric log-concave measures in $\mathbb{R}^n$, and the log-Brunn-Minkowski inequality.
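For orientation, the two conjectures are usually stated along the following lines (a sketch of commonly cited formulations, with hypotheses recalled from the literature rather than from the abstract above): the dimensional Brunn-Minkowski conjecture asserts that for a symmetric log-concave measure $\mu$ on $\mathbb{R}^n$, symmetric convex bodies $K, L$ and $\lambda \in [0,1]$, $$ \mu\big(\lambda K + (1-\lambda)L\big)^{1/n} \;\ge\; \lambda\,\mu(K)^{1/n} + (1-\lambda)\,\mu(L)^{1/n}, $$ while the log-Brunn-Minkowski conjecture asserts that, for origin-symmetric convex bodies, the volume satisfies $$ \big|\lambda\cdot K +_0 (1-\lambda)\cdot L\big| \;\ge\; |K|^{\lambda}\,|L|^{1-\lambda}, $$ where $+_0$ denotes the geometric ($L_0$) combination built from the support functions of $K$ and $L$.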
We derive a lower bound on the differential entropy of a log-concave random variable X in terms of the p-th absolute moment of X. The new bound leads to a reverse entropy power inequality with an explicit constant, and to new bounds on the rate-distortion function and the channel capacity. Specifically, we study the rate-distortion function for log-concave sources and distortion measure $d(x,\hat{x}) = |x-\hat{x}|^r$, with $r \ge 1$, and we establish that the difference between the rate-distortion function and the Shannon lower bound is at most $\log(\sqrt{\pi e}) \approx 1.5$ bits, independently of $r$ and the target distortion $d$. For mean-square error distortion, the difference is at most $\log(\sqrt{\pi e/2}) \approx 1$ bit, regardless of $d$. We also provide bounds on the capacity of memoryless additive noise channels when the noise is log-concave. We show that the difference between the capacity of such channels and the capacity of the Gaussian channel with the same noise power is at most $\log(\sqrt{\pi e/2}) \approx 1$ bit. Our results generalize to the case of a random vector X with possibly dependent coordinates. Our proof technique leverages tools from convex geometry.
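To make the mean-square case concrete, here is a sketch under standard conventions (rates in bits, $h$ the differential entropy, $d$ the per-symbol mean-square distortion); the explicit form of the Shannon lower bound is recalled from the literature, not from the abstract above: $$ R_{\mathrm{SLB}}(d) = h(X) - \tfrac{1}{2}\log(2\pi e\, d), \qquad R(d) - R_{\mathrm{SLB}}(d) \;\le\; \log\sqrt{\tfrac{\pi e}{2}} \approx 1.05\ \text{bits}, $$ uniformly in $d$, whenever the source $X$ is log-concave.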
An extension of the entropy power inequality to the form $N_r^{\alpha}(X+Y) \ge N_r^{\alpha}(X) + N_r^{\alpha}(Y)$ with arbitrary independent summands $X$ and $Y$ in $\mathbb{R}^n$ is obtained for the Rényi entropy and powers $\alpha \ge (r+1)/2$. Index Terms: Entropy power inequality, Rényi entropy.
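Here $N_r$ denotes the Rényi entropy power; as a reminder (a standard definition, with normalization conventions varying slightly across papers), for a random vector $X$ in $\mathbb{R}^n$ with density $f$ and order $r \in (0,1)\cup(1,\infty)$, $$ h_r(X) = \frac{1}{1-r}\,\log \int_{\mathbb{R}^n} f(x)^r\,dx, \qquad N_r(X) = e^{2h_r(X)/n}, $$ and the classical Shannon entropy power inequality corresponds to the limit $r \to 1$ with $\alpha = 1$.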
Let us define, for a compact set $A \subset \mathbb{R}^n$, the sequence $$ A(k) = \left\{\frac{a_1+\cdots+a_k}{k} : a_1, \ldots, a_k \in A\right\} = \frac{1}{k}\Big(\underset{k\ {\rm times}}{\underbrace{A + \cdots + A}}\Big). $$ It was independently proved by Shapley, Folkman and Starr (1969) and by Emerson and Greenleaf (1969) that $A(k)$ approaches the convex hull of $A$ in the Hausdorff distance induced by the Euclidean norm as $k$ goes to $\infty$. We explore in this survey how exactly $A(k)$ approaches the convex hull of $A$, and more generally, how a Minkowski sum of possibly different compact sets approaches convexity, as measured by various indices of non-convexity. The non-convexity indices considered include the Hausdorff distance induced by any norm on $\mathbb{R}^n$, the volume deficit (the difference of volumes), a non-convexity index introduced by Schneider (1975), and the effective standard deviation or inner radius. After first clarifying the interrelationships between these various indices of non-convexity, which were previously either unknown or scattered in the literature, we show that the volume deficit of $A(k)$ does not monotonically decrease to 0 in dimension 12 or above, thus falsifying a conjecture of Bobkov et al. (2011), even though their conjecture is proved to be true in dimension 1 and for certain sets $A$ with special structure. On the other hand, Schneider's index possesses a strong monotonicity property along the sequence $A(k)$, and both the Hausdorff distance and effective standard deviation are eventually monotone (once $k$ exceeds $n$). Along the way, we obtain new inequalities for the volume of the Minkowski sum of compact sets (showing that this is fractionally superadditive but not supermodular in general, but is indeed supermodular when the sets are convex), falsify a conjecture of Dyn and Farkhi (2004), demonstrate applications of our results to combinatorial discrepancy theory, and suggest some questions worthy of further investigation.
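As a simple worked illustration of the convergence (an example added here, not taken from the survey): for $A = \{0,1\} \subset \mathbb{R}$, $$ A(k) = \Big\{0, \tfrac{1}{k}, \tfrac{2}{k}, \ldots, 1\Big\}, \qquad d_H\big(A(k), \mathrm{conv}(A)\big) = d_H\big(A(k), [0,1]\big) = \tfrac{1}{2k}, $$ so in this one-dimensional example the Hausdorff distance to the convex hull decreases monotonically to 0 at rate $1/(2k)$.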
Let us define, for a compact set $A \subset \mathbb{R}^n$, the Minkowski averages of $A$: $$ A(k) = \left\{\frac{a_1+\cdots +a_k}{k} : a_1, \ldots, a_k\in A\right\}=\frac{1}{k}\Big(\underset{k\ {\rm times}}{\underbrace{A + \cdots + A}}\Big). $$ We study the monotonicity of the convergence of $A(k)$ towards the convex hull of $A$, when considering the Hausdorff distance, the volume deficit and a non-convexity index of Schneider as measures of convergence. For the volume deficit, we show that monotonicity fails in general, thus disproving a conjecture of Bobkov, Madiman and Wang. For Schneider's non-convexity index, we prove that a strong form of monotonicity holds, and for the Hausdorff distance, we establish that the sequence is eventually nonincreasing. Comment: 6 pages, including figures. Contains announcement of results that will be part of a more comprehensive, forthcoming paper. Version 2 corrects a typo.
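For reference, Schneider's non-convexity index, used above as one of the measures of convergence, is usually defined (a standard definition recalled from Schneider (1975), not from the abstracts above) by $$ c(A) = \inf\{\lambda \ge 0 : A + \lambda\,\mathrm{conv}(A)\ \text{is convex}\}, $$ so that $c(A) = 0$ whenever $A$ is convex.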