Published: 2000
DOI: 10.1137/s0895479896305696

A Multilinear Singular Value Decomposition

Abstract: We discuss a multilinear generalization of the singular value decomposition. There is a strong analogy between several properties of the matrix and the higher-order tensor decomposition; uniqueness, link with the matrix eigenvalue decomposition, first-order perturbation effects, etc., are analyzed. We investigate how tensor symmetries affect the decomposition and propose a multilinear generalization of the symmetric eigenvalue decomposition for pair-wise symmetric tensors.
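The decomposition discussed in the abstract is now commonly called the higher-order SVD (HOSVD): each factor matrix is obtained from an ordinary matrix SVD of a mode-n unfolding, and the core follows from an explicit multilinear product. Below is a minimal NumPy sketch, not the authors' code; the helper names and the column ordering inside `unfold` are our own choices (any column ordering leaves the left singular vectors unchanged, since it does not change the unfolding's Gram matrix).

```python
import numpy as np

def unfold(T, n):
    # Mode-n unfolding: the mode-n fibers of T become the columns of a matrix.
    # Column ordering differs from De Lathauwer's convention, but the left
    # singular vectors are unaffected by column permutations.
    return np.moveaxis(T, n, 0).reshape(T.shape[n], -1)

def mode_mul(T, A, n):
    # Mode-n product T x_n A: multiply every mode-n fiber of T by A.
    return np.moveaxis(np.tensordot(A, T, axes=(1, n)), 0, n)

def hosvd(T):
    # Factor matrices: left singular vectors of each mode-n unfolding.
    Us = [np.linalg.svd(unfold(T, n), full_matrices=False)[0]
          for n in range(T.ndim)]
    # All-orthogonal core: S = T x_1 U1^T x_2 U2^T ... x_N UN^T.
    S = T
    for n, U in enumerate(Us):
        S = mode_mul(S, U.T, n)
    return S, Us

# Sanity check: the core and factors reconstruct the tensor exactly.
X = np.random.randn(3, 4, 5)
S, Us = hosvd(X)
R = S
for n, U in enumerate(Us):
    R = mode_mul(R, U, n)
assert np.allclose(X, R)
```

Truncating each factor to its leading columns gives the truncated HOSVD, a standard (though generally not optimal) low multilinear-rank approximation.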

Cited by 3,561 publications (3,055 citation statements)
References 21 publications
“…In many applications such as dimensionality reduction and feature extraction, several existing Tucker decomposition algorithms consider orthogonality of factors, such as the Higher-Order Singular Value Decomposition (HOSVD) and Higher Order Orthogonal Iteration (HOOI) algorithms [21][22][23][24]. For Nonnegative Tucker Decomposition (NTD), the multiplicative algorithms [1,16,25,26] are natural extensions of multiplicative Nonnegative Matrix Factorization (NMF) algorithms based on minimization of the squared Euclidean distance (Frobenius norm) and the Kullback-Leibler divergence.…”
Section: Introduction (citation type: mentioning)
Confidence: 99%
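HOOI, mentioned alongside HOSVD in this excerpt, refines a truncated HOSVD by alternating over the modes: each factor is re-estimated from the SVD of the unfolding of the tensor projected onto all the other factors. A hedged NumPy sketch follows; the function name, loop structure, and fixed iteration count (rather than a convergence test) are our own simplifications.

```python
import numpy as np

def hooi(X, ranks, n_iter=25):
    # Higher-Order Orthogonal Iteration for a rank-(r1, ..., rN)
    # orthogonal Tucker approximation of X.
    unfold = lambda T, n: np.moveaxis(T, n, 0).reshape(T.shape[n], -1)
    mode_mul = lambda T, A, n: np.moveaxis(np.tensordot(A, T, axes=(1, n)), 0, n)
    # Initialize with truncated HOSVD factors.
    Us = [np.linalg.svd(unfold(X, n), full_matrices=False)[0][:, :ranks[n]]
          for n in range(X.ndim)]
    for _ in range(n_iter):
        for n in range(X.ndim):
            # Project X onto every factor except mode n ...
            Y = X
            for m in range(X.ndim):
                if m != n:
                    Y = mode_mul(Y, Us[m].T, m)
            # ... and refresh U_n from the dominant left singular vectors.
            Us[n] = np.linalg.svd(unfold(Y, n), full_matrices=False)[0][:, :ranks[n]]
    # Core of the final approximation.
    S = X
    for n in range(X.ndim):
        S = mode_mul(S, Us[n].T, n)
    return S, Us
```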
“…Our experiments show the effectiveness of the proposed unbalanced core tensor, especially when the target tensor is very unbalanced. We have constructed a collection of basis images for Drosophila gene expression pattern images from stages [11][12]. We plan to analyze the biological significance of the learnt basis images in the future.…”
Section: Results (citation type: mentioning)
Confidence: 99%
“…A key issue in applying the Tucker model is how to construct the core tensor. De Lathauwer et al. [12] presented the well-known HOSVD (Higher-Order Singular Value Decomposition) factorization, based on the SVD algorithm for matrices. Without a non-negativity requirement, it forces all factors to be orthogonal, so that the core tensor can be computed through a unique and explicit expression.…”
Section: Related Work (citation type: mentioning)
Confidence: 99%
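For reference, the "unique and explicit expression" for the core alluded to in the excerpt above is (in the notation of the HOSVD paper, with $\times_n$ denoting the mode-$n$ product):

$$\mathcal{S} = \mathcal{X} \times_1 U^{(1)T} \times_2 U^{(2)T} \cdots \times_N U^{(N)T},$$

where the columns of each $U^{(n)}$ are the left singular vectors of the mode-$n$ unfolding of $\mathcal{X}$; the orthogonality of the $U^{(n)}$ is what makes the core computable in this closed form.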
“…Now the task is to find $\{\omega_r\}$ from the $M = \sum_{r=1}^{R} M_r$ samples of $\mathcal{Y}$. By writing $\mathbf{g}_r = \begin{bmatrix} e^{j\omega_r} & e^{j\omega_r 2} & \cdots & e^{j\omega_r M_r} \end{bmatrix}^T$, to align with the presentation in Section 2 we define the $r$th unfolding of $\mathcal{X}$ as the transposed version of [19]:…”
Section: Extension To Higher Dimension (citation type: mentioning)
Confidence: 99%
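To make the reconstructed definition above concrete, a tiny NumPy illustration of $\mathbf{g}_r$ (the function name is ours; it simply samples the complex exponential at frequency $\omega_r$ for $m = 1, \dots, M_r$):

```python
import numpy as np

def g(w_r, M_r):
    # g_r = [e^{j w_r}, e^{j 2 w_r}, ..., e^{j M_r w_r}]^T
    m = np.arange(1, M_r + 1)
    return np.exp(1j * w_r * m)
```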
“…It is worth pointing out that the derivation of the weighting matrix and the performance analysis are different from those of [18]. Section 3 generalizes the proposed solutions to higher-dimensional signals with the use of tensor algebra [19]. Simulation results are included in Section 4 to corroborate the theoretical development and to compare the correlation-based approach with the approximate iterative quadratic ML (AIQML) [15], IMDF [16], and UE [8] algorithms, as well as the Cramér-Rao lower bound (CRLB) [20].…”
Citation type: mentioning
Confidence: 99%