Using the matrix product state (MPS) representation of the recently proposed tensor ring (TR) decomposition, we propose a tensor completion algorithm that alternately minimizes over the factors in the MPS representation. This development is motivated in part by the success of matrix completion algorithms that alternate over the low-rank factors. We further propose a spectral initialization for the tensor ring completion algorithm and analyze the computational complexity of the proposed algorithm. We numerically compare it with existing methods that employ a low-rank tensor train approximation for data completion and show that our method outperforms them in a variety of real computer vision settings, thus demonstrating the improved expressive power of the tensor ring format as compared to the tensor train.
I. INTRODUCTION

Tensor decompositions for representing and storing data have recently attracted considerable attention due to their effectiveness in compressing data for statistical signal processing [1]-[5]. In this paper we focus on the Tensor Ring (TR) decomposition [6], and in particular its relation to the Matrix Product States (MPS) representation [7], and use it to complete data with missing entries. In this context our algorithm is motivated by recent work in matrix completion, where, under a suitable initialization, an alternating minimization algorithm [8], [9] over the low-rank factors is able to accurately predict the missing data.

Recently, tensor networks, regarded as generalizations of tensor decompositions, have emerged as potentially powerful tools for the analysis of large-scale tensor data [7]. The most popular tensor network is the Tensor Train (TT) representation, which for an order-d tensor with each dimension of size n requires O(dnr^2) parameters, where r is the rank of each of the factors, and thus allows for efficient data representation [10]. Tensor completion based on the tensor train decomposition has recently been considered in [11], [12]. The authors of [11] considered data completion based on the alternating least squares method.

Although the TT format has been widely applied in numerical analysis, its applications to image classification and completion are rather limited [4], [11], [12]. As outlined in [6], the TT decomposition suffers from the following limitations: (i) the TT model requires rank-1 constraints on the border factors, (ii) TT ranks are typically small for the near-border factors and large for the middle factors, and (iii) the multiplication of the TT factors is not permutation invariant. In order to alleviate these drawbacks, a tensor ring (TR) decomposition has been proposed in [6]. The TR decomposition removes the unit-rank constraints on the boundary tensor factors and utilizes a