2020
DOI: 10.36227/techrxiv.12762392.v1
Preprint

Dynamic L1-norm Tucker Tensor Decomposition

Abstract: Tucker decomposition is a standard method for processing multi-way (tensor) measurements and finds many applications in machine learning and data mining, among other fields. When tensor measurements arrive in a streaming fashion or are too many to jointly decompose, incremental Tucker analysis is preferred. In addition, dynamic basis adaptation is desired when the nominal data subspaces change. At the same time, it has been documented that outliers in the data can significantly compromise the performa…
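For orientation, here is a minimal NumPy sketch of standard (L2-based) Tucker decomposition via truncated higher-order SVD, the baseline that the paper's dynamic L1-norm variant hardens against outliers. The function name `tucker_hosvd` and the chosen ranks are illustrative assumptions, not the paper's own implementation.

```python
import numpy as np

def tucker_hosvd(X, ranks):
    """Truncated HOSVD: a basic, non-robust Tucker decomposition.
    Returns a core tensor G and one orthonormal factor matrix per mode."""
    factors = []
    for mode, r in enumerate(ranks):
        # Unfold X along `mode` and keep the top-r left singular vectors.
        Xm = np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)
        U, _, _ = np.linalg.svd(Xm, full_matrices=False)
        factors.append(U[:, :r])
    # Core tensor: project X onto each mode's factor subspace.
    G = X
    for mode, U in enumerate(factors):
        G = np.moveaxis(np.tensordot(U.T, np.moveaxis(G, mode, 0), axes=1), 0, mode)
    return G, factors

# Example: decompose a random 3-way tensor.
rng = np.random.default_rng(0)
X = rng.standard_normal((6, 5, 4))
G, factors = tucker_hosvd(X, ranks=(3, 3, 2))
print(G.shape)  # (3, 3, 2)
```

Because each factor comes from an L2-optimal SVD, a single gross outlier entry can rotate the recovered subspaces arbitrarily; replacing the L2 objective with an L1-norm one, and updating the bases incrementally as new slices arrive, is the robustness/streaming combination the abstract describes.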


Cited by 3 publications (2 citation statements)
References 16 publications
“…Future research can be directed towards extending the novel online BTD algorithm to incorporate constraints and side information [69] and perform completion [66] and DL [11] tasks. Modifications necessary for solving large-scale problems (using, for example, sampling/sketching) [54], [80] or being robust (to outliers) [12], [43] are also worth exploring. Coupled [81], [82] and Bayesian [83] versions should also be developed, greatly widening its applications spectrum.…”
Section: Discussion
confidence: 99%
“…Possible solution approaches include alternating optimization with alternating direction method of multipliers (AO-ADMM) [42]. Incorporating ℓ 1 -norm constraints in dynamic TD allows for rejecting outliers or detecting subspace changes [43]. [44] relies on the well-known in robust principal component analysis (PCA) [12], [45] low rank plus sparse representation model to come up with an online CPD scheme for (ADMM-based) outlier-resistant tracking and completion.…”
Section: Related Work
confidence: 99%