We incorporate into the empirical measure $P_n$ the auxiliary information given by a finite collection of expectations, in a way that is optimal from the information-geometry viewpoint. This allows us to unify several methods exploiting side information and to uniquely define an informed empirical measure $P_n^I$. These methods are shown to share the same asymptotic properties. We then study the informed empirical process $\sqrt{n}(P_n^I - P)$ when the auxiliary information is true. We establish Glivenko-Cantelli and Donsker theorems for $P_n^I$ under minimal assumptions, and we quantify the asymptotic uniform variance reduction. Moreover, we prove that the informed empirical process is more concentrated than the empirical process $\sqrt{n}(P_n - P)$ for all $n$ large enough. Finally, as an illustration of the variance reduction, we apply some of these results to informed empirical quantiles.
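As a rough illustration of the kind of construction described above (not the paper's own algorithm), one natural information-geometric way to build an informed empirical measure is the I-projection of $P_n$ onto the moment constraint: reweight the observations by an exponential tilt $w_i \propto \exp(\lambda\, g(X_i))$ so that the weighted mean of $g$ matches its known expectation, which minimizes the Kullback-Leibler divergence from the uniform weights. The sketch below, with illustrative names and a single scalar constraint, solves the convex dual by Newton's method:

```python
import numpy as np

def informed_weights(g_vals, m, tol=1e-10, max_iter=100):
    """Weights w_i proportional to exp(lam * g(X_i)) with sum_i w_i g(X_i) = m.

    This is the I-projection (minimum KL divergence from uniform weights)
    of the empirical measure onto the moment constraint E[g(X)] = m.
    Illustrative sketch only; a single scalar constraint is assumed.
    """
    lam = 0.0
    for _ in range(max_iter):
        z = np.exp(lam * g_vals)
        w = z / z.sum()                     # tilted weights
        mean = w @ g_vals                   # current tilted mean of g
        var = w @ (g_vals - mean) ** 2      # derivative of the tilted mean in lam
        step = (m - mean) / var             # Newton step on the convex dual
        lam += step
        if abs(step) < tol:
            break
    return w

rng = np.random.default_rng(0)
x = rng.normal(loc=0.3, scale=1.0, size=500)
# Side information: the true mean of X is 0, i.e. g(x) = x and m = 0.
w = informed_weights(x, 0.0)
print(np.isclose(w @ x, 0.0), np.isclose(w.sum(), 1.0))
```

The informed empirical measure then puts mass $w_i$ on $X_i$ instead of $1/n$; the tilt shrinks the weighted mean to the known value while perturbing the sample as little as possible in the KL sense.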