This paper considers the fundamental limit of compressed sensing for i.i.d. signal distributions and i.i.d. Gaussian measurement matrices. Its main contribution is a rigorous characterization of the asymptotic mutual information (MI) and minimum mean-square error (MMSE) in this setting. Under mild technical conditions, our results show that the limiting MI and MMSE are equal to the values predicted by the replica method from statistical physics. This resolves a well-known problem that has remained open for over a decade.

arXiv:1607.02524v1 [cs.IT] 8 Jul 2016

$I_n(\delta) = I_{\mathrm{RS}}(\delta)$. (ii) The sequence of MMSE functions $M_n(\delta)$ converges almost everywhere to the replica prediction. In other words, for all continuity points of $M_{\mathrm{RS}}(\delta)$,
$$\lim_{n \to \infty} M_n(\delta) = M_{\mathrm{RS}}(\delta).$$

Remark 1. The primary contribution of Theorem 1 is for the case where $M_{\mathrm{RS}}(\delta)$ has a discontinuity. This occurs, for example, in applications such as compressed sensing with sparse priors and CDMA with finite-alphabet signaling. For the special case where $M_{\mathrm{RS}}(\delta)$ is continuous, the validity of the replica prediction can also be established by combining the AMP analysis with the I-MMSE relationship [9]-[13].

Remark 2. For a given signal distribution $P_X$, the single-crossing property can be verified by numerically evaluating the replica MMSE and checking whether it crosses the fixed-point curve more than once.
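The numerical check described in Remark 2 can be sketched as follows. This is an illustrative example, not the paper's procedure: the Gaussian prior, the state-evolution-style fixed-point map $g(\tau) = \sigma^2 + \delta^{-1}\,\mathrm{mmse}(1/\tau)$, and the parameter values are all assumptions. The Gaussian prior is chosen precisely because its scalar MMSE, $\mathrm{mmse}(s) = 1/(1+s)$, yields a unique fixed point, so the crossing count should be one.

```python
import numpy as np

def mmse_gauss(snr):
    # Scalar MMSE for a standard Gaussian signal observed in Gaussian noise.
    return 1.0 / (1.0 + snr)

def crossing_count(delta, sigma2, mmse, taus):
    # Fixed-point curve g(tau) = sigma2 + mmse(1/tau) / delta (assumed form);
    # count sign changes of g(tau) - tau over a grid of tau values.
    g = sigma2 + mmse(1.0 / taus) / delta
    h = g - taus
    return int(np.sum(np.diff(np.sign(h)) != 0))

taus = np.linspace(1e-3, 10.0, 100_000)
print(crossing_count(delta=0.5, sigma2=0.1, mmse=mmse_gauss, taus=taus))  # prints 1
```

A prior for which the replica MMSE has a discontinuity (e.g., a sparse prior) would instead produce multiple crossings over some range of $\delta$; substituting its scalar MMSE for `mmse_gauss` is the analogous check.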
C. Related Work

The replica method was originally developed to study mean-field approximations in spin glasses [14], [15]. It was first applied to linear estimation problems in the context of CDMA wireless communication [1], [2], [16], with subsequent work focusing directly on compressed sensing [3]-[8].

Within the context of compressed sensing, the predictions of the replica method have been proven rigorously in a number of settings. One example is given by message passing on matrices with special structure, such as sparsity [9], [17], [18] or spatial coupling [19]-[21]. However, in the case of i.i.d. matrices, the results are limited to signal distributions with a unique fixed point [10], [12] (e.g., Gaussian inputs [22], [23]). For the special case of i.i.d. matrices with binary inputs, it has also been shown that the replica prediction provides an upper bound on the asymptotic mutual information [24]. Bounds on the locations of discontinuities in the MMSE with sparse priors have also been obtained by analyzing the problem of approximate support recovery [6]-[8].

Recent work by Huleihel and Merhav [25] addresses the validity of the replica MMSE directly in the case of Gaussian mixture models, using tools from statistical physics and random matrix theory [26], [27].
D. Notation

We use $C$ to denote an absolute constant and $C_\theta$ to denote a number that depends on a parameter $\theta$. In all cases, the numbers $C$ and $C_\theta$ are positive and finite, although their values