In this paper, we present an overview of constrained parallel factor (PARAFAC) models in which the constraints model linear dependencies among columns of the factor matrices of the tensor decomposition or, alternatively, the pattern of interactions between different modes of the tensor that is captured by the equivalent core tensor. Some tensor prerequisites are first introduced, with particular emphasis on mode combination using Kronecker products of canonical vectors, which makes matricization operations easier. This Kronecker product-based approach is also formulated in terms of an index notation, which provides an original and concise formalism both for matricizing tensors and for writing tensor models. Then, after a brief reminder of PARAFAC and Tucker models, two families of constrained tensor models, the so-called PARALIND/CONFAC and PARATUCK models, are described in a unified framework for Nth-order tensors. New tensor models, called nested Tucker models and block PARALIND/CONFAC models, are also introduced. A link between PARATUCK models and constrained PARAFAC models is then established. Finally, new uniqueness properties of PARATUCK models are deduced from sufficient conditions for essential uniqueness of their associated constrained PARAFAC models.

Keywords: Constrained PARAFAC; PARALIND/CONFAC; PARATUCK; Tensor models; Tucker models
1 Introduction

Tensor calculus was introduced in differential geometry at the end of the nineteenth century, and tensor analysis was then developed in the context of Einstein's theory of general relativity, with the introduction of index notation, the so-called Einstein summation convention, at the beginning of the twentieth century, which simplifies and shortens physics equations involving tensors. Index notation is also useful for simplifying multivariate statistical calculations, particularly those involving cumulant tensors [1]. Generally speaking, tensors are used in physics and differential geometry for characterizing the properties of a physical system, representing fundamental laws of physics, and defining geometrical objects whose components are functions. When these functions are defined over a continuum of points of a mathematical space, the tensor forms what is called a tensor field, a generalization of a vector field, used to solve problems involving curved surfaces or spaces.

After the first tensor developments by mathematicians and physicists, the need to analyze collections of data matrices that can be seen as three-way data arrays gave rise to three-way models for data analysis, with the pioneering works of Tucker in psychometrics [3] and Harshman in phonetics [4], who proposed what are now referred to as the Tucker and parallel factor (PARAFAC) decompositions, respectively. The PARAFAC decomposition was independently proposed by Carroll and Chang [5] under the name canonical decomposition (CANDECOMP) and later called CANDECOMP/PARAFAC (CP) in [6]. For a history of the development of multi-way models in the context ...
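As a minimal illustration of the two decompositions named above (the notation used here, with factor matrices $A$, $B$, $C$ and core tensor $\mathcal{G}$, is assumed for this example and is not fixed by the text above), a third-order tensor $\mathcal{X} \in \mathbb{R}^{I \times J \times K}$ admits the element-wise representations
\[
x_{ijk} = \sum_{r=1}^{R} a_{ir}\, b_{jr}\, c_{kr} \qquad \text{(PARAFAC)},
\]
\[
x_{ijk} = \sum_{p=1}^{P} \sum_{q=1}^{Q} \sum_{r=1}^{R} g_{pqr}\, a_{ip}\, b_{jq}\, c_{kr} \qquad \text{(Tucker)}.
\]
PARAFAC can be viewed as the special case of the Tucker model with $P = Q = R$ and a superdiagonal core, $g_{pqr} = \delta_{pqr}$.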