The class of ℓq-regularized least squares (LQLS) estimators is considered for estimating β ∈ R^p from n noisy linear observations y = Xβ + w. The performance of these schemes is studied under the high-dimensional asymptotic setting in which the dimension of the signal grows linearly with the number of measurements. In this asymptotic setting, phase transition (PT) diagrams are often used to compare the performance of different estimators. A PT diagram specifies the minimum number of observations a given estimator requires to recover a structured signal, e.g. a sparse one, from its noiseless linear observations. Although phase transition analysis has been shown to provide useful information for compressed sensing, the fact that it ignores the measurement noise not only limits its applicability in many application areas, but may also lead to misunderstandings. For instance, consider a linear regression problem in which n > p and the signal is not exactly sparse. If the measurement noise is ignored in such a system, regularization techniques such as LQLS seem irrelevant, since even ordinary least squares (OLS) returns the exact solution. However, it is well known that if n is not much larger than p, then regularization improves the performance of OLS. In response to this limitation of PT analysis, we consider a low-noise sensitivity analysis. We show that this analysis framework (i) reveals the advantage of LQLS over OLS, (ii) captures the differences between LQLS estimators even when n > p, and (iii) provides a fair comparison among different estimators at high signal-to-noise ratios. As an application of this framework, we show that under mild conditions LASSO outperforms the other LQLS estimators even when the signal is dense.
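For concreteness, the LQLS family discussed above can be written in the standard form used in this literature; the exact normalization, the range of q, and the definition of the asymptotic mean squared error (AMSE) referenced later are stated here as assumptions and may differ slightly from the paper's own conventions:

```latex
\hat{\beta}(\lambda, q) \;\in\; \arg\min_{\beta \in \mathbb{R}^p}
\; \frac{1}{2}\,\|y - X\beta\|_2^2 \;+\; \lambda \|\beta\|_q^q,
\qquad \|\beta\|_q^q \;=\; \sum_{i=1}^{p} |\beta_i|^q,
\]
where $q = 1$ gives LASSO, $q = 2$ gives ridge regression, and the AMSE is
the high-dimensional limit of the normalized estimation error,
\[
\mathrm{AMSE}(\lambda, q, \sigma_w) \;=\;
\lim_{p \to \infty} \frac{1}{p}\,\mathbb{E}\,\|\hat{\beta}(\lambda, q) - \beta\|_2^2 .
```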
Finally, by a simple transformation we connect our low-noise sensitivity framework to the classical asymptotic regime in which n/p → ∞, and we characterize how and when regularization techniques offer improvements over ordinary least squares, and which regularizer gives the most improvement when the sample size is large. The low-noise sensitivity framework has two main advantages:

1. It reveals certain phenomena that are important in applications and are not captured by PT analysis. For instance, one immediately sees the impact of the regularizer and of the magnitudes of the elements of β on the AMSE. Furthermore, these relations are expressed explicitly and can be interpreted easily.

2. It provides a bridge between the phase transition analysis proposed in compressed sensing and the classical large-sample asymptotic regime (n/p → ∞). We discuss some of the implications of this connection for the classical asymptotics in Section 3.3.

As a consequence, low-noise sensitivity analysis enables us to present a fair comparison among different LQLS estimators and to reveal the different factors that affect their performance.
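As an illustrative sketch (not taken from the paper) of the claim that regularization improves OLS when n is only slightly larger than p, the following simulation compares OLS with ridge regression, i.e. LQLS with q = 2, chosen here instead of LASSO only because it has a closed form and keeps the sketch dependency-free. The dimensions, noise level, and λ grid are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, sigma = 110, 100, 1.0          # n barely larger than p: X'X is ill-conditioned
X = rng.standard_normal((n, p))
beta = rng.standard_normal(p)        # dense (non-sparse) signal
y = X @ beta + sigma * rng.standard_normal(n)

# OLS: unbiased, but its variance blows up along the small eigen-directions of X'X.
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
mse_ols = np.mean((beta_ols - beta) ** 2)

# Ridge (LQLS with q = 2): closed-form solution (X'X + lam I)^{-1} X'y.
def ridge(lam):
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

lams = [0.01, 0.1, 1.0, 10.0, 100.0]
mse_ridge = {lam: np.mean((ridge(lam) - beta) ** 2) for lam in lams}

print(f"OLS   MSE: {mse_ols:.4f}")
for lam, m in mse_ridge.items():
    print(f"ridge lam={lam:>6}: MSE {m:.4f}")
```

For a well-chosen λ the ridge error falls below the OLS error, while very large λ over-shrinks and degrades it again; as λ → 0 ridge recovers OLS, which is why the comparison is only interesting when n/p is close to one.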