The analysis of various models in statistical physics relies on the existence of decompositions of measures into mixtures of product-like components, where the goal is to attain a decomposition into measures whose entropy is close to that of the original measure, yet with small correlations between coordinates. We prove a related general result: for every measure $\mu$ on $\mathbb{R}^n$ and every $\varepsilon > 0$, there exists a decomposition $\mu = \int \mu_\theta \, dm(\theta)$ such that $H(\mu) - \mathbb{E}_{\theta \sim m} H(\mu_\theta) \leq \mathrm{Tr}(\mathrm{Cov}(\mu))\,\varepsilon$ and $\mathbb{E}_{\theta \sim m} \mathrm{Cov}(\mu_\theta) \preceq \mathrm{Id}/\varepsilon$. As an application, we derive a general bound for the mean-field approximation of Ising and Potts models which is, in a sense, dimension-free, in both continuous and discrete settings. In particular, for an Ising model on $\{\pm 1\}^n$ or on $[-1,1]^n$, we show that the deficit between the mean-field approximation and the free energy is at most $C \, \frac{1+p}{p} \left( n \|J\|_{S_p} \right)^{\frac{p}{1+p}}$ for all $p > 0$, where $\|J\|_{S_p}$ denotes the Schatten-$p$ norm of the interaction matrix $J$. For the case $p = 2$, this recovers the result of [JKR18], while an optimal choice of $p$ often yields almost dimension-free bounds.
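As a brief illustration (not part of the original statement, and assuming the deficit bound takes the form $C\,\frac{1+p}{p}\,(n\|J\|_{S_p})^{p/(1+p)}$), specializing to $p = 2$ shows how the result of [JKR18] is recovered, since the Schatten-$2$ norm is the Frobenius norm:

```latex
% Specializing the deficit bound to p = 2 (Schatten-2 norm = Frobenius norm):
%   C * (1+p)/p * (n ||J||_{S_p})^{p/(1+p)}
% with p = 2 gives exponent 2/3 and prefactor 3/2:
\[
  C \, \frac{1+2}{2} \, \bigl( n \|J\|_{S_2} \bigr)^{\frac{2}{1+2}}
  \;=\; \frac{3C}{2} \, n^{2/3} \, \|J\|_{F}^{2/3},
\]
% i.e. the same n^{2/3} ||J||_F^{2/3} scaling as in [JKR18].
```

Taking instead a small $p$ trades a larger prefactor $\frac{1+p}{p}$ for an exponent $\frac{p}{1+p}$ close to $0$, which is the mechanism behind the almost dimension-free bounds mentioned above.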