Given a non-singular matrix, factorize it into the product of two matrices, and then form a new matrix by multiplying the two factors together in reverse order. Repeat the process to produce a sequence of matrices, all similar to the given matrix. This is the essence of an LR method. For several factorization techniques the sequence converges, quite generally, to a triangular matrix from which the eigenvalues may be read off. The LR transformations discussed here preserve the form of a Hessenberg matrix (aᵢⱼ = 0 if i > j+1), and for such matrices the work involved in the transformation from one N × N matrix to the next is proportional to N² as against N³ for full matrices. Using this and other devices, the latest versions of the transformation are among the best available methods for finding the eigenvalues of general matrices. The evolution of these algorithms can be traced back to classical function theory, each method being numerically more feasible than its predecessor.
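The following is a minimal sketch of the basic iteration described above, using a triangular (LU-type) factorization without pivoting as the factorization step; it is illustrative only, and the function names lr_step and lr_eigenvalues are not from the source. It ignores the Hessenberg economies and the convergence safeguards of practical versions, and it can break down if a zero pivot is encountered.

```python
import numpy as np

def lr_step(a):
    """One LR step: factorize A = L R (unit lower times upper triangular,
    no pivoting), then return the similar matrix R L = L^{-1} A L."""
    n = a.shape[0]
    l = np.eye(n)
    r = a.astype(float)
    for k in range(n - 1):
        for i in range(k + 1, n):
            m = r[i, k] / r[k, k]      # assumes the pivot r[k, k] is non-zero
            l[i, k] = m
            r[i, k:] -= m * r[k, k:]   # eliminate below the diagonal
    return r @ l                        # similar to A, so eigenvalues are preserved

def lr_eigenvalues(a, iterations=200):
    """Iterate LR steps; when the sequence converges to (essentially)
    triangular form, the diagonal approaches the eigenvalues."""
    for _ in range(iterations):
        a = lr_step(a)
    return np.diag(a)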