2016
DOI: 10.1080/03081087.2016.1234578

On best rank-2 and rank-(2,2,2) approximations of order-3 tensors

Abstract: It is well known that a best rank-R approximation of order-3 tensors may not exist for R ≥ 2. A best rank-(R, R, R) approximation always exists, however, and is also a best rank-R approximation when it has rank (at most) R. For R = 2 and real order-3 tensors it is shown that a best rank-2 approximation is also a local minimum of the best rank-(2,2,2) approximation problem. This implies that if all rank-(2,2,2) minima have rank larger than 2, then a best rank-2 approximation does not exist. This provides an eas…
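As a point of reference for the non-existence phenomenon, here is a standard example from the border-rank literature (not taken from this abstract): for linearly independent $a, b \in \mathbb{R}^n$, the tensor
$$T = a \otimes a \otimes b + a \otimes b \otimes a + b \otimes a \otimes a$$
has rank 3, yet it is the limit of the rank-2 tensors
$$T_n = n\Big(a + \tfrac{1}{n}b\Big)^{\otimes 3} - n\, a^{\otimes 3} = T + O(1/n),$$
so $\inf\{\|T - S\| : \operatorname{rank}(S) \le 2\} = 0$ is not attained and $T$ has no best rank-2 approximation. The multilinear-rank constraint $\le (2,2,2)$, in contrast, defines a closed set, which is why a best rank-(2,2,2) approximation always exists.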

Cited by 5 publications (4 citation statements)
References 45 publications
“…The following theorem shows that case (b) cannot happen. This was proven for tensors by Stegeman and Friedland [18, Lemma 3.4]. We generalize their result to arbitrary varieties.…”
Section: Projective Varieties (supporting)
confidence: 54%
“…This vanishes on τ (X). For any pair (µ, ν) as in (18), we write f (µ,ν) for the unique preimage of (22) under the map C[x] → C[a, b]. This is well-defined by Proposition 5.2.…”
Section: The Tangential Variety of the Veronese (mentioning)
confidence: 99%
“…Characterizing the rank of a tensor is a complex mathematical problem without a simple solution (Alexeev, Forbes, & Tsimerman, 2011; Ballico, Bernardi, Chiantini, & Guardo, 2018; Kolda & Bader, 2009; Stegeman & Friedland, 2017). Therefore, we attempt tensor decomposition beginning by selecting a single component, and increasing the number of components until the algorithm consistently converges.…”
Section: Methods (mentioning)
confidence: 99%
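The quoted Methods passage describes an incremental procedure: fit CP decompositions of increasing rank until the fitting algorithm converges reliably. The following is a minimal sketch of that idea, assuming a plain NumPy alternating-least-squares (ALS) fit and a hypothetical convergence rule (change in relative error below a tolerance, checked over a few random restarts); it is not the cited authors' actual code.

```python
import numpy as np

def cp_als(T, rank, n_iter=500, tol=1e-9, seed=0):
    """ALS for a rank-`rank` CP approximation of an order-3 tensor T.
    Returns (relative error, converged flag)."""
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    norm_T = np.linalg.norm(T)
    prev_err = np.inf
    for _ in range(n_iter):
        # Each factor update is a linear least-squares solve with the other
        # two factors fixed (the Khatri-Rao products are folded into einsum).
        A = np.einsum('ijk,jr,kr->ir', T, B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = np.einsum('ijk,ir,kr->jr', T, A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = np.einsum('ijk,ir,jr->kr', T, A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
        err = np.linalg.norm(T - np.einsum('ir,jr,kr->ijk', A, B, C)) / norm_T
        if abs(prev_err - err) < tol:
            return err, True
        prev_err = err
    return err, False

# Start from one component and add components until ALS converges for every
# random restart (a hypothetical reading of the quoted stopping rule).
T = np.random.default_rng(1).standard_normal((4, 5, 6))
for rank in range(1, 6):
    results = [cp_als(T, rank, seed=s) for s in range(3)]
    if all(conv for _, conv in results):
        best = min(err for err, _ in results)
        print(f"rank {rank}: all restarts converged, rel. error {best:.3f}")
        break
```

For ranks at which a best approximation fails to exist, ALS typically produces diverging, nearly cancelling factor columns rather than clean convergence; that degeneracy is the practical symptom of the non-existence phenomenon analysed in the paper.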
“…When the constraints are given by polynomial equations these problems may be solved using methods from algebraic geometry; techniques for this have been developed by several authors [13,19]. In mathematics the algebraic-geometric ED problem has been studied in the contexts of low-rank tensor and low-rank matrix approximation, see for example [19,21]. In systems biology, as discussed above, the ED problem has been used to study the model selection problem in [9].…”
Section: Introduction (mentioning)
confidence: 99%