Let $X = (X_i)_{1 \le i \le n}$ be an i.i.d. sample of square-integrable variables in $\mathbb{R}^d$, with common expectation $\mu$ and covariance matrix $\Sigma$, both unknown. We consider the problem of testing whether $\mu$ is $\eta$-close to zero, i.e. $\|\mu\| \le \eta$, against $\|\mu\| \ge \eta + \delta$; we also tackle the more general two-sample mean closeness testing problem. The aim of this paper is to obtain nonasymptotic upper and lower bounds on the minimal separation distance $\delta$ such that both the Type I and Type II errors can be controlled at a given level. The main technical tools are concentration inequalities, first for a suitable estimator of $\|\mu\|^2$ used as a test statistic, and second for estimating the operator and Frobenius norms of $\Sigma$, which enter the quantiles of said test statistic. These properties are established for Gaussian and bounded distributions. Particular attention is given to the dependence on the pseudo-dimension $d_*$ of the distribution, defined as $d_* := \|\Sigma\|_2^2 / \|\Sigma\|_\infty^2$. In particular, for $\eta = 0$, the minimal separation distance is $\Theta\big(d_*^{1/4} \sqrt{\|\Sigma\|_\infty / n}\big)$, in contrast with the minimax estimation distance for $\mu$, which is $\Theta\big(d_e^{1/2} \sqrt{\|\Sigma\|_\infty / n}\big)$ (where $d_e := \|\Sigma\|_1 / \|\Sigma\|_\infty$). This generalizes a phenomenon spelled out in particular by Baraud (2002).
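The abstract refers to "a suitable estimator of $\|\mu\|^2$ used as a test statistic". A minimal sketch of one standard choice for such an estimator is the U-statistic that averages the off-diagonal pairwise inner products, which is unbiased for $\|\mu\|^2$ since $\mathbb{E}\langle X_i, X_j\rangle = \|\mu\|^2$ for $i \ne j$; the paper's exact construction may differ, and the function name below is illustrative.

```python
import numpy as np

def squared_mean_norm_ustat(X):
    """Unbiased U-statistic estimator of ||mu||^2 from a sample X of shape (n, d).

    For independent X_i, X_j (i != j), E[<X_i, X_j>] = ||mu||^2, so averaging
    the off-diagonal entries of the Gram matrix is unbiased for ||mu||^2.
    """
    n = X.shape[0]
    G = X @ X.T                         # Gram matrix of pairwise inner products
    off_diag_sum = G.sum() - np.trace(G)
    return off_diag_sum / (n * (n - 1))

# Sanity check on synthetic Gaussian data: under mu = 0 the statistic
# concentrates near 0; under mu = 0.5 * ones(d) it concentrates near
# ||mu||^2 = 0.25 * d.
rng = np.random.default_rng(0)
n, d = 2000, 50
X0 = rng.normal(size=(n, d))            # mean zero
X1 = rng.normal(size=(n, d)) + 0.5      # ||mu||^2 = 12.5
print(squared_mean_norm_ustat(X0))      # close to 0
print(squared_mean_norm_ustat(X1))      # close to 12.5
```

Rejecting when this statistic exceeds a quantile threshold (which, per the abstract, depends on the operator and Frobenius norms of $\Sigma$, themselves estimated) yields the kind of test whose separation rate the paper analyzes.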