Consider the linear regression model y = β₀1 + Xβ + ε in the usual notation. It is argued that the class of ordinary ridge estimators, obtained by shrinking the least squares estimator by the matrix (X'X + kI)^{-1}X'X, is sensitive to outliers in the y-variable. To overcome this problem, we propose a new class of ridge-type M-estimators, obtained by shrinking an M-estimator (instead of the least squares estimator) by the same matrix. Since the optimal value of the ridge parameter k is unknown, we suggest a procedure for choosing it adaptively. In a reasonably large-scale simulation study with a particular M-estimator, we found that if the conditions are such that the M-estimator is more efficient than the least squares estimator, then the corresponding ridge-type M-estimator proposed here is better, in terms of a Mean Squared Error criterion, than the ordinary ridge estimator with k chosen suitably. An example illustrates that the estimators proposed here are less sensitive to outliers in the y-variable than ordinary ridge estimators.
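The construction above can be sketched in code: fit an M-estimator and then multiply it by the shrinkage matrix (X'X + kI)^{-1}X'X. The sketch below is illustrative only, assuming the Huber ψ-function with a MAD scale estimate as the "particular M-estimator" (the abstract does not specify which one was used), and a user-supplied k rather than the adaptive choice proposed in the paper.

```python
import numpy as np

def huber_m_estimator(X, y, c=1.345, n_iter=50):
    """Huber M-estimator of the regression coefficients, computed by
    iteratively reweighted least squares (IRLS). This is one common
    choice of M-estimator; the paper's particular choice is an assumption."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]  # start at least squares
    for _ in range(n_iter):
        r = y - X @ beta
        # robust scale estimate: median absolute deviation (MAD)
        s = np.median(np.abs(r - np.median(r))) / 0.6745
        u = r / max(s, 1e-12)
        # Huber weights: 1 for small residuals, downweight large ones
        w = np.where(np.abs(u) <= c, 1.0, c / np.abs(u))
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    return beta

def ridge_type_m_estimator(X, y, k, c=1.345):
    """Ridge-type M-estimator: shrink the M-estimator by
    (X'X + kI)^{-1} X'X, the same matrix used for ordinary ridge.
    Here k is supplied by the user; the paper chooses it adaptively."""
    beta_m = huber_m_estimator(X, y, c=c)
    XtX = X.T @ X
    shrink = np.linalg.solve(XtX + k * np.eye(X.shape[1]), XtX)
    return shrink @ beta_m

# Usage: with k = 0 the shrinkage matrix is the identity and the
# M-estimator is returned unchanged; k > 0 shrinks it toward zero.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(size=100)
beta_m = huber_m_estimator(X, y)
beta_rm = ridge_type_m_estimator(X, y, k=5.0)
```

Because the shrinkage matrix has eigenvalues λᵢ/(λᵢ + k) < 1 (with λᵢ the eigenvalues of X'X), the ridge-type M-estimator always has smaller norm than the underlying M-estimator for k > 0, mirroring how ordinary ridge shrinks least squares.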