The Kaczmarz and Gauss-Seidel methods aim to solve an m × n linear system Xβ = y by iteratively refining the solution estimate: the former uses random rows of X to update β so that the corresponding equations are satisfied, while the latter uses random columns of X to update the corresponding coordinates of β. Recent work analyzed these algorithms in a parallel comparison for overcomplete and undercomplete systems, showing convergence to the ordinary least squares (OLS) solution and the minimum Euclidean norm solution, respectively. This paper considers the natural follow-up to the OLS problem: ridge (Tikhonov) regularized regression. By viewing both methods as variants of randomized coordinate descent, we present variants of the randomized Kaczmarz (RK) and randomized Gauss-Seidel (RGS) algorithms for solving this system and derive their convergence rates. We prove that a recent proposal, which can be interpreted as randomizing over both rows and columns, is strictly suboptimal: instead, one should always work with randomly selected columns (RGS) when m > n (more rows than columns) and with randomly selected rows (RK) when n > m (more columns than rows).

* Authors are listed in alphabetical order.

arXiv:1507.05844v2 [math.NA]
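To make the two update rules concrete, here is a minimal NumPy sketch of the standard unregularized RK and RGS iterations described above (with the usual sampling probabilities proportional to squared row/column norms); this illustrates the row-versus-column distinction only, not the ridge-regularized variants developed in the paper:

```python
import numpy as np

def randomized_kaczmarz(X, y, iters=5000, seed=0):
    """RK: each step projects beta onto the solution hyperplane of one
    randomly chosen equation (row), sampled w.p. proportional to its
    squared norm."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    row_norms = np.sum(X**2, axis=1)
    probs = row_norms / row_norms.sum()
    beta = np.zeros(n)
    for _ in range(iters):
        i = rng.choice(m, p=probs)
        beta += (y[i] - X[i] @ beta) / row_norms[i] * X[i]
    return beta

def randomized_gauss_seidel(X, y, iters=5000, seed=0):
    """RGS: each step exactly minimizes the least-squares objective along
    one randomly chosen coordinate (column), sampled w.p. proportional to
    its squared norm."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    col_norms = np.sum(X**2, axis=0)
    probs = col_norms / col_norms.sum()
    beta = np.zeros(n)
    residual = y.copy()  # maintain r = y - X @ beta incrementally
    for _ in range(iters):
        j = rng.choice(n, p=probs)
        step = X[:, j] @ residual / col_norms[j]
        beta[j] += step
        residual -= step * X[:, j]
    return beta
```

On a consistent overdetermined system both iterations converge to the unique solution, which is a quick way to sanity-check the updates.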