2016
DOI: 10.48550/arxiv.1606.05535
Preprint

Tensor Ring Decomposition

Cited by 109 publications (187 citation statements) | References 48 publications

“…When λ = 0, (1.1) reduces to the tensor completion problem. Unlike the matrix case, there exist several kinds of tensor rank, such as the Tucker rank [12], multi-rank and tubal rank [13], tensor train (TT) rank [14], and tensor ring (TR) rank [15], each derived from the corresponding tensor decomposition. [Figure: (a) TT decomposition, (b) TR decomposition, (c) FCTN decomposition.] In general, the minimization of tensor rank is NP-hard [16], so a convex/nonconvex relaxation of the tensor rank, or the low-rank tensor decomposition itself, is usually used instead.…”
Section: Introduction
confidence: 99%
“…TR decomposition [15] (as shown in Fig. 1.1(b)) decomposes an Nth-order tensor X ∈ R^{I_1×I_2×···×I_N} into a circular multilinear product of a list of third-order core tensors, and the element-wise form is expressed as…”
Section: Introduction
confidence: 99%
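The element-wise form referenced above (and defined in the TR paper) evaluates each entry as the trace of a circular product of matrix slices of the cores. A minimal sketch, assuming cores `G[n]` of shape (R_n, I_n, R_{n+1}) with the ring closed by R_{N+1} = R_1; the helper names `tr_element` and `tr_full` are illustrative, not from the source:

```python
import numpy as np

def tr_element(cores, index):
    """One entry X(i_1, ..., i_N) = Tr( G_1[:, i_1, :] @ ... @ G_N[:, i_N, :] )."""
    prod = cores[0][:, index[0], :]
    for G, i in zip(cores[1:], index[1:]):
        prod = prod @ G[:, i, :]
    return np.trace(prod)  # the trace closes the ring

def tr_full(cores):
    """Reconstruct the full tensor entry by entry (feasible only for small sizes)."""
    shape = tuple(G.shape[1] for G in cores)
    X = np.empty(shape)
    for idx in np.ndindex(*shape):
        X[idx] = tr_element(cores, idx)
    return X

# Usage: three random cores with ring ranks (2, 3, 2) and mode sizes (4, 5, 6).
rng = np.random.default_rng(0)
ranks, dims = [2, 3, 2], [4, 5, 6]
cores = [rng.standard_normal((ranks[n], dims[n], ranks[(n + 1) % 3]))
         for n in range(3)]
X = tr_full(cores)
print(X.shape)  # (4, 5, 6)
```

The same reconstruction can be written as a single contraction, e.g. `np.einsum('aib,bjc,cka->ijk', *cores)`, which makes the circular index pattern explicit.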
“…The Tucker rank has been applied to the LRTC problem by minimizing its convex surrogate [14] or non-convex surrogates [15,16]. Moreover, a series of tensor-network-decomposition-based ranks have been proposed, such as the tensor train (TT) rank [17], tensor ring (TR) rank [18], and fully-connected tensor network (FCTN) rank [19]. All of them have achieved great success in higher-order LRTC [19][20][21].…”
Section: Introduction
confidence: 99%
“…Tensor network (TN) [40], [41], one of the powerful tools in physics, is broadly applied to solving large-scale optimization problems and provides a graphical interpretation of the computation. TD models based on TN representation have recently attracted much attention, including the tensor train (TT) [42] and tensor ring (TR) [43] representations, which overcome the curse of dimensionality [40] with linear storage cost O(NIR²). Lately, TN has been exploited to explore more complex topology structures [42], [43], [44] for TD, e.g., Projected Entangled Pair States (PEPS) [45] and the Fully-Connected TN (FCTN) [46].…”
confidence: 99%
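The O(NIR²) storage claim quoted above can be checked with a quick parameter count: a full dense Nth-order tensor with mode size I needs I^N entries, while N ring cores of shape (R, I, R) need only N·I·R² — linear in N. A small sketch (the function names are illustrative):

```python
def full_storage(N, I):
    """Entries in a dense N-th order tensor with every mode of size I."""
    return I ** N

def tr_storage(N, I, R):
    """Entries in N TR cores of shape (R, I, R): linear in the order N."""
    return N * I * R * R

# Usage: order 10, mode size 8, ring rank 4.
N, I, R = 10, 8, 4
print(full_storage(N, I))   # 1073741824
print(tr_storage(N, I, R))  # 1280
```

The gap widens exponentially with the order N, which is the "curse of dimensionality" the quoted passage says TT/TR representations overcome.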
“…TD models based on TN representation have recently attracted much attention, including the tensor train (TT) [42] and tensor ring (TR) [43] representations, which overcome the curse of dimensionality [40] with linear storage cost O(NIR²). Lately, TN has been exploited to explore more complex topology structures [42], [43], [44] for TD, e.g., Projected Entangled Pair States (PEPS) [45] and the Fully-Connected TN (FCTN) [46]. However, most available high-dimensional data suffer from mode-dimension unbalance [47] (e.g., filters in convolutional neural networks) and mode-correlation discrepancy (e.g., the spatial modes of a color image are strongly intercorrelated but only weakly correlated with the channel mode).…”
confidence: 99%