2021
DOI: 10.1088/2632-2153/abcb4f
Matrix and tensor completion using tensor ring decomposition with sparse representation

Abstract: Completing a data tensor with structured missing components is a challenging task in which the missing components are not distributed randomly but admit regular patterns, e.g. missing columns and rows or missing blocks/patches. Many existing tensor completion algorithms cannot handle such scenarios. In this paper, we propose a novel and efficient approach for matrix/tensor completion by applying Hankelization and distributed tensor ring decomposition. Our main idea is first Hankelizing a…
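As a rough illustration of the Hankelization step the abstract describes, the sketch below embeds a 1-D signal into a Hankel matrix and inverts the map by anti-diagonal averaging. The window parameter `tau` and the 1-D setting are illustrative assumptions; the paper applies the idea to matrices/tensors with its own construction.

```python
import numpy as np

def hankelize(x, tau):
    """Map a 1-D signal to a Hankel (delay-embedding) matrix.

    H[i, j] = x[i + j], with shape (tau, len(x) - tau + 1).
    """
    n = len(x)
    return np.array([x[i:i + n - tau + 1] for i in range(tau)])

def dehankelize(H):
    """Invert hankelize() by averaging along anti-diagonals,
    since each signal sample x[t] appears at every H[i, j] with i + j = t."""
    tau, m = H.shape
    n = tau + m - 1
    x = np.zeros(n)
    counts = np.zeros(n)
    for i in range(tau):
        for j in range(m):
            x[i + j] += H[i, j]
            counts[i + j] += 1
    return x / counts

x = np.arange(8.0)
H = hankelize(x, 3)
assert H.shape == (3, 6)
assert np.allclose(dehankelize(H), x)
```

The point of this embedding for completion is that a structured gap in `x` (e.g. a missing block) is spread across many entries of `H`, where low-rank structure can be exploited.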

Cited by 5 publications (6 citation statements); references 87 publications.
“…In the case of organized missing components, such as missing rows and columns or blocks/patches, the task of completing a data tensor is more difficult since these components are not dispersed randomly. Such circumstances are not handled by many of the available tensor completion techniques [Ahmadi et al. [10]]. a. Solver of Tensor Train …”
Section: Introduction (mentioning)
confidence: 99%
“…The process of recovering (reconstructing) an incomplete data tensor from its partially observed data tensor is called tensor completion. In the past few decades, many algorithms have been developed to solve this problem [5,6,7] due to its importance in several applications such as in recommender systems [8], computer vision [9], chemometrics [10], link prediction [11], etc. Tensor completion algorithms are generally categorized into 1) rank minimization and 2) tensor decomposition techniques in which the concepts of tensor rank and tensor decomposition play key roles in the analysis.…”
Section: Introduction (mentioning)
confidence: 99%
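The statement above groups completion methods into rank-minimization and tensor-decomposition approaches; the paper under discussion belongs to the latter, using a tensor ring (TR) model. A minimal sketch of how TR cores contract back into a full tensor follows; the core shapes and ranks are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def tr_to_full(cores):
    """Contract tensor-ring cores G_k of shape (r_k, n_k, r_{k+1}) into the
    full tensor: T[i1,...,id] = trace(G1[:,i1,:] @ G2[:,i2,:] @ ... @ Gd[:,id,:])."""
    full = cores[0]                                  # (r1, n1, r2)
    for G in cores[1:]:
        # (r1, n1, ..., nk, rk) contracted with (rk, nk+1, rk+1)
        full = np.tensordot(full, G, axes=([-1], [0]))
    # Close the ring: trace over the matching first/last rank index r1.
    return np.trace(full, axis1=0, axis2=-1)

rng = np.random.default_rng(0)
shapes = [(2, 4, 3), (3, 5, 4), (4, 6, 2)]           # ring ranks (2, 3, 4, 2)
cores = [rng.standard_normal(s) for s in shapes]
T = tr_to_full(cores)
assert T.shape == (4, 5, 6)
```

In decomposition-based completion, cores like these are fitted to only the observed entries, and `tr_to_full` then predicts the missing ones.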
“…Sparse signal representation (SSR) is used in many applications, such as image and tensor denoising, tensor compression, tensor completion, face and audio signal recognition, blind source separation, inverse synthetic aperture radar (ISAR) image formation and classification, and so on [1–14]. This area of signal processing rests on two fundamental principles: proper selection of basis signals (called atoms), known as Dictionary Learning (DL), and efficient methods for computing the sparse representation of signals over the set of obtained atoms (the dictionary), called Sparse Coding (SC).…”
Section: Introduction (mentioning)
confidence: 99%
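The Sparse Coding step mentioned in the statement above can be illustrated with a greedy solver such as Orthogonal Matching Pursuit. This is a generic textbook sketch with illustrative dictionary sizes, not the algorithm of the cited paper or of the article this page indexes.

```python
import numpy as np

def omp(D, y, k):
    """Orthogonal Matching Pursuit: greedily select k atoms (columns of D,
    assumed unit-norm) that correlate most with the residual, then refit
    the coefficients on the selected support by least squares."""
    residual = y.astype(float).copy()
    support = []
    coef = np.zeros(0)
    for _ in range(k):
        corr = np.abs(D.T @ residual)
        corr[support] = -np.inf                   # never re-select an atom
        support.append(int(np.argmax(corr)))
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x = np.zeros(D.shape[1])
    x[support] = coef
    return x

# Toy overcomplete dictionary and a 3-sparse signal (illustrative sizes).
rng = np.random.default_rng(1)
D = rng.standard_normal((30, 60))
D /= np.linalg.norm(D, axis=0)                    # unit-norm atoms
x_true = np.zeros(60)
x_true[[5, 17, 42]] = [1.5, -2.0, 0.8]
y = D @ x_true
x_hat = omp(D, y, k=3)
assert np.count_nonzero(x_hat) <= 3               # k-sparse by construction
```

Dictionary Learning, the other principle named above, alternates a sparse-coding step like this with an update of the atoms themselves.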