Abstract-A random variable with distribution P is observed in Gaussian noise and is estimated by a mismatched minimum mean-square estimator that assumes that the distribution is Q, instead of P. This paper shows that the integral over all signal-to-noise ratios (SNRs) of the excess mean-square estimation error incurred by the mismatched estimator is twice the relative entropy D(P‖Q) (in nats). This representation of relative entropy can be generalized to nonreal-valued random variables, and can be particularized to give new general representations of mutual information in terms of conditional means. Inspired by the new representation, we also propose a definition of free relative entropy, which fills a gap in, and is consistent with, the literature on free probability.

Index Terms-Divergence, free probability, minimum mean-square error (MMSE) estimation, mutual information, relative entropy, Shannon theory, statistics.
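In symbols (notation assumed here for illustration, not fixed by the abstract: let mse_P(γ) and mse_Q(γ) denote the mean-square errors at SNR γ of the matched estimator and of the mismatched estimator that assumes Q), the main identity reads

\[
2\,D(P\,\|\,Q) \;=\; \int_0^\infty \bigl[\mathrm{mse}_Q(\gamma)-\mathrm{mse}_P(\gamma)\bigr]\,d\gamma \quad \text{(nats)}.
\]

A minimal numerical sketch of this identity, assuming the special case P = N(0,1) and Q = N(0, s²), for which the mismatched (linear) estimator, both mean-square errors, and D(P‖Q) all have closed forms; the variable names below are illustrative, not from the paper:

```python
import numpy as np
from scipy.integrate import quad

s2 = 4.0  # variance assumed by the mismatched estimator (illustrative choice)

def mse_matched(g):
    # Matched MMSE of X ~ N(0,1) observed as Y = sqrt(g)*X + N(0,1)
    return 1.0 / (1.0 + g)

def mse_mismatched(g):
    # MSE, under the true prior P = N(0,1), of the linear estimator
    # a*Y that is Bayes-optimal under the assumed prior Q = N(0, s2)
    a = np.sqrt(g) * s2 / (1.0 + g * s2)       # mismatched estimator gain
    return (1.0 - a * np.sqrt(g))**2 + a**2    # signal term + noise term

# Integral over all SNRs of the excess mean-square estimation error
integral, _ = quad(lambda g: mse_mismatched(g) - mse_matched(g), 0, np.inf)

# Closed-form relative entropy D(P || Q) in nats for these two Gaussians
D = 0.5 * (np.log(s2) + 1.0 / s2 - 1.0)

print(integral, 2 * D)  # the two values should agree
```

With s² = 4, the closed form gives 2 D(P‖Q) = ln 4 + 1/4 − 1 ≈ 0.6363, and the numerical integral of the excess error matches it to quadrature precision.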