A unified and generalized framework for a recursive least squares (RLS)-like least mean square (LMS) algorithm is proposed, which adopts the cost function of the RLS to minimize the mean square error. This paper aims to explore, in a systematic way, the corresponding ideas scattered across, and repeatedly re-invented in, the literature, and introduces a unified approach in the same spirit as [1], which relates the LMS to the Kalman filter. The proposed alternative to the RLS is favored when the matrix inversion lemma is not applicable, such as in the case of multivariate or multichannel data where the input is not a vector. Furthermore, all the derivations are conducted in the quaternion domain and are hence generalizable to complex- and real-valued models. The resulting algorithm has a neat form and a complexity similar to that of the RLS. Experiments show that the method exhibits performance close to, or even better than, that of the RLS algorithm. Other aspects, such as the choice of descent directions and variable step sizes, are also discussed to support the analysis.