Let X = (X_i)_{1≤i≤n} be an i.i.d. sample of square-integrable variables in R^d, with common expectation µ and covariance matrix Σ, both unknown. We consider the problem of testing whether µ is η-close to zero, i.e. ‖µ‖ ≤ η against ‖µ‖ ≥ (η + δ); we also tackle the more general two-sample mean closeness testing problem. The aim of this paper is to obtain nonasymptotic upper and lower bounds on the minimal separation distance δ such that both the Type I and Type II errors can be controlled at a given level. The main technical tools are concentration inequalities, first for a suitable estimator of ‖µ‖² used as a test statistic, and second for estimating the operator and Frobenius norms of Σ entering the quantiles of said test statistic. These properties are established for Gaussian and bounded distributions. Particular attention is given to the dependence on the pseudo-dimension d_* of the distribution, defined as d_* := ‖Σ‖₂²/‖Σ‖_∞², where ‖Σ‖₂ and ‖Σ‖_∞ denote the Frobenius and operator norms of Σ. In particular, for η = 0, the minimum separation distance is Θ(d_*^{1/4} √(‖Σ‖_∞/n)), in contrast with the minimax estimation distance for µ, which is Θ(d_e^{1/2} √(‖Σ‖_∞/n)) (where d_e := ‖Σ‖₁/‖Σ‖_∞). This generalizes a phenomenon spelled out in particular by Baraud (2002).
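As an illustration of the kind of test statistic the abstract refers to, a standard unbiased estimator of ‖µ‖² is the U-statistic averaging the pairwise inner products ⟨X_i, X_j⟩ over i ≠ j; it removes the diagonal bias E‖X_i‖² = ‖µ‖² + Tr(Σ) that the plug-in ‖X̄‖² carries at order 1/n. The sketch below (in numpy; the function name is ours, and the paper's actual statistic and quantile calibration may differ) computes it via the off-diagonal mass of the Gram matrix:

```python
import numpy as np

def squared_mean_norm_ustat(X):
    """Unbiased U-statistic estimator of ||mu||^2:

        U = (1 / (n (n-1))) * sum_{i != j} <X_i, X_j>,

    computed as (||sum_i X_i||^2 - sum_i ||X_i||^2) / (n (n-1)),
    i.e. the off-diagonal mass of the Gram matrix X X^T.
    """
    n = X.shape[0]
    s = X.sum(axis=0)                      # column sums, shape (d,)
    sq = np.einsum("ij,ij->", X, X)        # sum_i ||X_i||^2
    return (s @ s - sq) / (n * (n - 1))

# Illustration on a seeded Gaussian sample with ||mu||^2 = 0.5.
rng = np.random.default_rng(0)
mu = np.array([0.5, 0.0, -0.5, 0.0])
X = mu + rng.standard_normal((2000, 4))
print(squared_mean_norm_ustat(X))  # concentrates around ||mu||^2 = 0.5
```

Under the null µ = 0 this statistic is a degenerate U-statistic whose fluctuations are governed by the Frobenius norm ‖Σ‖₂, which is why the separation rate involves d_*^{1/4}‖Σ‖_∞^{1/2} = ‖Σ‖₂^{1/2} rather than the full trace.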