“…Originally motivated by tensor factorization, the t-product has become prominent in the tensor and signal processing communities. Under the t-product, tensors enjoy a matrix-like algebraic framework that has proved useful in applications such as dictionary learning [33,38], low-rank tensor completion [41,32,39,40], facial recognition [11], and neural networks [27,36]. The naive process of transforming higher-order tensors into two-dimensional arrays via flattening or unfolding is often referred to as "matricization".…”
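As a concrete illustration of the algebraic framework mentioned above, the following is a minimal sketch of the t-product of two third-order tensors, using the standard equivalence between the t-product and slice-wise matrix multiplication in the Fourier domain along the third mode. The function name `t_product` and the use of NumPy are assumptions for illustration, not part of the original text.

```python
import numpy as np

def t_product(A, B):
    """t-product of third-order tensors A (n1 x n2 x n3) and B (n2 x m x n3).

    Computed by taking the DFT along the third mode, multiplying the
    corresponding frontal slices, and inverting the DFT. This is the
    usual FFT-based realization of the t-product.
    """
    if A.shape[2] != B.shape[2] or A.shape[1] != B.shape[0]:
        raise ValueError("incompatible tensor dimensions for the t-product")
    n3 = A.shape[2]
    Ah = np.fft.fft(A, axis=2)  # transform each tube of A
    Bh = np.fft.fft(B, axis=2)  # transform each tube of B
    Ch = np.empty((A.shape[0], B.shape[1], n3), dtype=complex)
    for k in range(n3):
        # ordinary matrix product of the k-th frontal slices
        Ch[:, :, k] = Ah[:, :, k] @ Bh[:, :, k]
    # for real inputs the result is real up to floating-point error
    return np.real(np.fft.ifft(Ch, axis=2))
```

Under this product, the tensor whose first frontal slice is the identity matrix (and whose remaining slices are zero) acts as a multiplicative identity, which is one reason the t-product supports matrix-like notions such as inverses and orthogonality.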