2015
DOI: 10.1137/140983410

Estimating a Few Extreme Singular Values and Vectors for Large-Scale Matrices in Tensor Train Format

Abstract: We propose new algorithms for the singular value decomposition (SVD) of very large-scale matrices based on a low-rank tensor approximation technique called the tensor train (TT) format. The proposed algorithms can compute several dominant singular values and the corresponding singular vectors for large-scale structured matrices given in TT format. The computational complexity of the proposed methods scales logarithmically with the matrix size under the assumption that both the matrix and the singular vectors admit l…
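
The key point in the abstract is that the matrix is never formed explicitly: the algorithms only need matrix-vector products, which TT arithmetic keeps cheap. As a minimal, hypothetical illustration of that matrix-free setting (not the paper's algorithm), the Python sketch below uses a Kronecker product B ⊗ C, which is the simplest example of a TT-format matrix (TT rank 1), together with SciPy's `svds`, to estimate a few dominant singular values without ever materializing the full 2000 × 2000 matrix.

```python
# Hypothetical sketch: matrix-free estimation of a few dominant singular
# values.  A Kronecker product B (x) C is a TT-rank-1 matrix; its action on a
# vector costs on the order of the sizes of B and C, not of B (x) C.
import numpy as np
from scipy.sparse.linalg import LinearOperator, svds

rng = np.random.default_rng(0)
B = rng.standard_normal((50, 50))
C = rng.standard_normal((40, 40))
m, n = B.shape[0] * C.shape[0], B.shape[1] * C.shape[1]  # 2000 x 2000 overall

def matvec(v):
    # (B kron C) vec(X) = vec(C X B^T) for a column-major vec of X
    X = v.reshape(C.shape[1], B.shape[1], order="F")
    return (C @ X @ B.T).reshape(-1, order="F")

def rmatvec(u):
    # adjoint: (B kron C)^T = B^T kron C^T
    U = u.reshape(C.shape[0], B.shape[0], order="F")
    return (C.T @ U @ B).reshape(-1, order="F")

A = LinearOperator((m, n), matvec=matvec, rmatvec=rmatvec)
s = svds(A, k=3, return_singular_vectors=False)  # 3 largest singular values
print(np.sort(s)[::-1])

# Sanity check: the singular values of B kron C are exactly the pairwise
# products of the singular values of B and C.
sB = np.linalg.svd(B, compute_uv=False)
sC = np.linalg.svd(C, compute_uv=False)
print(np.sort(np.outer(sB, sC).ravel())[::-1][:3])
```

For a genuine TT matrix of higher rank the same pattern applies, with the matvec carried out core by core; that is what lets the cost scale with the TT ranks and the number of cores rather than with the matrix size itself.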

Cited by 28 publications (24 citation statements). References 34 publications (76 reference statements).

“…In particular, we focus on Symmetric Eigenvalue Decomposition (EVD/PCA) and Generalized Eigenvalue Decomposition (GEVD) [Hubig et al., 2015, Huckle and Waldherr, 2012, Kressner et al., 2014a], SVD [Lee and Cichocki, 2015], solutions of overdetermined and underdetermined systems of linear algebraic equations [Dolgov and Savostyanov, 2014, Oseledets and Dolgov, 2012], the Moore–Penrose pseudo-inverse of structured matrices [Lee and Cichocki, 2016b], and LASSO regression problems [Lee and Cichocki, 2016a]. Tensor networks for extremely large-scale multi-block (multi-view) data are also discussed, especially TN models for orthogonal Canonical Correlation Analysis (CCA) and related Higher-Order Partial Least Squares (HOPLS) problems [Hou, 2017, Hou et al., 2016b, Zhao et al., 2011].…”
Section: Chapter
“…The traditional methods for solving eigenvalue problems for a symmetric matrix $A \in \mathbb{R}^{I \times I}$ are prohibitive for very large values of $I$, say $I = 10^{15}$ or higher. This computational bottleneck can be dealt with very efficiently through low-rank tensor approximations, and the last 10 years have witnessed the development of such techniques for several classes of optimization problems, including EVD/PCA and SVD [Kressner et al., 2014a, Lee and Cichocki, 2015]. The principle is to represent the cost function in a tensor format; under certain conditions, such tensors can often be quite well approximated in a low-rank TT format, thus allowing for a low-dimensional parametrization.…”
Section: TT Network for Computing the Single Smallest Eigenvalue and …
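
The "low-dimensional parametrization" remark can be made concrete with a back-of-the-envelope count: a length-$I$ vector with $I = 2^d$, quantized into $d$ binary modes and stored in TT (QTT) format with maximal TT rank $r$, needs at most $2dr^2$ numbers, which is logarithmic in $I$. The snippet below is purely illustrative; the rank $r = 20$ is an assumed value, not taken from the cited work.

```python
# Back-of-the-envelope storage count for the QTT format (illustrative only).
import math

I = 2**50              # vector length ~1.1e15, hopeless to store densely
d = int(math.log2(I))  # number of binary (QTT) cores
r = 20                 # assumed maximal TT rank

dense_storage = I              # entries of the full vector
qtt_storage = d * 2 * r * r    # upper bound: d cores of size r x 2 x r
print(f"dense: {dense_storage:.3e} entries, QTT bound: {qtt_storage:,} entries")
```

With these assumptions the dense vector would need about 1.1 × 10^15 entries while the QTT representation is bounded by 40,000, which is the sense in which the complexity scales logarithmically with the problem size.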