This paper derives sufficient conditions for local recovery of coordinate dictionaries comprising a Kronecker-structured dictionary that is used for representing Kth-order tensor data. Tensor observations are assumed to be generated from a Kronecker-structured dictionary multiplied by sparse coefficient tensors that follow the separable sparsity model. This work provides sufficient conditions on the underlying coordinate dictionaries, coefficient and noise distributions, and number of samples that guarantee recovery of the individual coordinate dictionaries up to a specified error, as a local minimum of the objective function, with high probability. In particular, the sample complexity to recover K coordinate dictionaries with dimensions m_k × p_k up to estimation error ε_k is explicitly characterized.

Index Terms—Dictionary identification, dictionary learning, Kronecker-structured dictionary, sample complexity, sparse representations, tensor data, Tucker decomposition.

1024 × p_2, and 32 × p_3, where p_1, p_2 ≥ 1024 and p_3 ≥ 32. This gives rise to a total of 1024(p_1 + p_2) + 32p_3 unknown parameters in KS DL, which is significantly smaller than 2^25 p. While such "parameter counting" points to the usefulness of KS DL for tensor data, a fundamental question remains open in the literature: what are the theoretical limits on the learning of KS dictionaries underlying Kth-order tensor data? To answer this question, we examine the KS-DL objective function and find sufficient conditions on the number of samples (or sample complexity) for successful local identification of the coordinate dictionaries underlying the KS dictionary. To the best of our knowledge, this is the first work presenting such identification results for the KS-DL problem.
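The parameter-counting argument above, and the Kronecker/Tucker correspondence it rests on, can be sketched numerically. The snippet below is an illustrative sketch only: the sizes p_1 = 2048, p_2 = 2048, p_3 = 64 are hypothetical choices satisfying the stated constraints (p_1, p_2 ≥ 1024, p_3 ≥ 32), not values from the paper. The second half checks, on toy dimensions, the standard identity that a Kronecker-structured dictionary acting on a vectorized coefficient tensor equals the Tucker mode-product form.

```python
import numpy as np

# Illustrative KS-DL parameter count: third-order data of dimension
# 1024 x 1024 x 32 (so the ambient dimension is 2**25), with coordinate
# dictionaries of sizes 1024 x p1, 1024 x p2, and 32 x p3.
# The values below are hypothetical, chosen to satisfy p1, p2 >= 1024, p3 >= 32.
p1, p2, p3 = 2048, 2048, 64
ks_params = 1024 * (p1 + p2) + 32 * p3            # coordinate dictionaries only
unstructured_params = 2**25 * (p1 * p2 * p3)      # full m x p dictionary, p = p1*p2*p3
print(ks_params, unstructured_params)             # KS count is smaller by many orders

# Toy-scale check: (D1 kron D2 kron D3) vec(X) equals the row-major
# vectorization of the Tucker form X x1 D1 x2 D2 x3 D3.
rng = np.random.default_rng(0)
m, p = (4, 3, 2), (5, 4, 3)                       # toy coordinate-dictionary sizes
D1, D2, D3 = (rng.standard_normal((m[k], p[k])) for k in range(3))
X = rng.standard_normal(p)                        # dense stand-in for a sparse coefficient tensor
y_kron = np.kron(np.kron(D1, D2), D3) @ X.ravel()
Y_tucker = np.einsum('ia,jb,kc,abc->ijk', D1, D2, D3, X)
assert np.allclose(y_kron, Y_tucker.ravel())
```

The check passes because NumPy's row-major `ravel` ordering matches the column-index ordering of `np.kron(np.kron(D1, D2), D3)`, which is why the KS-DL literature can move freely between the Kronecker and Tucker views of the same model.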
A. Our Contributions

We derive sufficient conditions on the true coordinate dictionaries, coefficient and noise distributions, regularization parameter, and the number of data samples such that the KS-DL objective function has a local minimum within a

arXiv:1712.03471v3 [stat.ML]