Low-rank tensor recovery in the presence of sparse but arbitrary errors is an important problem with many practical applications. In this work, we propose a general framework for recovering low-rank tensors whose observations may be deformed by unknown transformations and corrupted by arbitrary sparse errors. We give a unified treatment of surrogate-based formulations that incorporate the features of rectification and alignment simultaneously, and we establish worst-case error bounds for the recovered tensor. In this context, the state-of-the-art methods RASL and TILT can be viewed as two special cases of our framework, yet each performs only part of the function of our method. We then study the optimization aspects of the problem in detail, deriving two algorithms: one based on the alternating direction method of multipliers (ADMM) and the other based on proximal gradient descent. We provide convergence guarantees to global optimality for the latter algorithm, and demonstrate the performance of the former through in-depth simulations. Finally, we present extensive experimental results on public datasets that demonstrate the efficacy of the proposed framework and algorithms.