We develop a reverse entropy power inequality for convex measures, which may be seen as an affine-geometric inverse of the entropy power inequality of Shannon and Stam. The specialization of this inequality to log-concave measures may be seen as a version of Milman's reverse Brunn-Minkowski inequality. The proof relies on a demonstration of new relationships between the entropy of high-dimensional random vectors and the volume of convex bodies, and on a study of effective supports of convex measures, both of which are of independent interest, as well as on Milman's deep technology of M-ellipsoids and on certain information-theoretic inequalities. As a by-product, we also give a continuous analogue of some Plünnecke-Ruzsa inequalities from additive combinatorics.

Here we recall basic definitions and the characterization of the so-called convex measures. Given −∞ ≤ κ ≤ 1, a probability measure µ on R^n is called κ-concave if it satisfies the Brunn-Minkowski-type inequality

µ(tA + (1 − t)B) ≥ [t µ(A)^κ + (1 − t) µ(B)^κ]^{1/κ}    (2.1)

for all t ∈ (0, 1) and for all Borel measurable sets A, B ⊂ R^n with positive measure. When κ = 0, understood as the limiting case, (2.1) describes the class of log-concave measures, which thus satisfy µ(tA + (1 − t)B) ≥ µ(A)^t µ(B)^{1−t}.
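The log-concave case κ = 0 is obtained by letting κ → 0 in the κ-mean on the right-hand side of the κ-concavity inequality; the excerpt states the conclusion without the computation, so here is a standard sketch of the limit (with a = µ(A), b = µ(B) both positive):

```latex
% Limit of the \kappa-mean as \kappa \to 0, with a = \mu(A) > 0, b = \mu(B) > 0:
\begin{align*}
\lim_{\kappa \to 0}\bigl(t\,a^{\kappa} + (1-t)\,b^{\kappa}\bigr)^{1/\kappa}
  &= \lim_{\kappa \to 0}
     \exp\!\Bigl(\tfrac{1}{\kappa}\,\log\bigl(t\,a^{\kappa} + (1-t)\,b^{\kappa}\bigr)\Bigr)\\
  % As \kappa \to 0 we have a^{\kappa} = e^{\kappa \log a} = 1 + \kappa \log a + o(\kappa),
  % so t a^{\kappa} + (1-t) b^{\kappa} = 1 + \kappa\,(t \log a + (1-t)\log b) + o(\kappa):
  &= \exp\!\bigl(t \log a + (1-t)\log b\bigr)\\
  &= a^{t}\, b^{1-t}.
\end{align*}
% Hence the \kappa = 0 case of the inequality reads
% \mu\bigl(tA + (1-t)B\bigr) \ge \mu(A)^{t}\,\mu(B)^{1-t},
% i.e. \mu is log-concave.
```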