In this paper, we address the robustness, in the sense of ℓ2-stability, of the set-membership normalized least-mean-square (SM-NLMS) and the set-membership affine projection (SM-AP) algorithms. For the SM-NLMS algorithm, we demonstrate that it is robust regardless of the choice of its parameters and that it improves the parameter estimates in most of the iterations in which an update occurs, two advantages over the classical NLMS algorithm. Moreover, we prove that if the noise bound is known, then the SM-NLMS can be set so that it never degrades the estimate. As for the SM-AP algorithm, we demonstrate that its robustness depends on a judicious choice of one of its parameters: the constraint vector (CV). We prove the existence of CVs satisfying the robustness condition, but practical choices remain unknown. We also demonstrate that neither the SM-AP nor the SM-NLMS algorithm diverges, even when their parameters are selected naively, provided the additional noise is bounded. Numerical results corroborating our analyses are presented.
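As a concrete illustration of the data-selective behavior discussed above, the following sketch implements the standard SM-NLMS recursion in a system-identification setting: the filter updates only when the a priori error magnitude exceeds the bound γ. The filter order, the unknown system, the noise bound, and the common heuristic γ = √5 · (noise bound) are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def sm_nlms(x, d, order, gamma, delta=1e-12):
    """Set-membership NLMS: update only when |a priori error| > gamma."""
    w = np.zeros(order)
    updates = 0
    for k in range(order - 1, len(x)):
        xk = x[k - order + 1:k + 1][::-1]   # regressor, most recent sample first
        e = d[k] - w @ xk                   # a priori output error
        if abs(e) > gamma:                  # data-selective (set-membership) test
            mu = 1.0 - gamma / abs(e)       # step size that shrinks |e| to gamma
            w = w + mu * e * xk / (xk @ xk + delta)
            updates += 1
    return w, updates

# Illustrative identification scenario with bounded (uniform) noise.
rng = np.random.default_rng(0)
w_true = np.array([0.5, -0.3, 0.2, 0.1])    # hypothetical unknown system
x = rng.standard_normal(2000)
noise_bound = 0.05
noise = rng.uniform(-noise_bound, noise_bound, size=len(x))
d = np.convolve(x, w_true)[:len(x)] + noise
gamma = np.sqrt(5) * noise_bound            # heuristic choice of the error bound
w_hat, n_updates = sm_nlms(x, d, order=4, gamma=gamma)
```

Because the noise is bounded by γ, the update test eventually stops firing once the estimate is close to `w_true`, so only a small fraction of the iterations perform an update, which is the computational appeal of the set-membership approach.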