2018
DOI: 10.1109/tnnls.2017.2766160
Rank-One Matrix Completion With Automatic Rank Estimation via L1-Norm Regularization

Abstract: Completing a matrix from a small subset of its entries, i.e., matrix completion, is a challenging problem arising in many real-world applications, such as machine learning and computer vision. One popular approach to the matrix completion problem is based on low-rank decomposition/factorization. Low-rank decomposition-based methods often require a prespecified rank, which is difficult to determine in practice. In this paper, we propose a novel low-rank decomposition-based matrix completion method…

Cited by 39 publications (10 citation statements)
References 51 publications
“…On the other hand, matrix completion [36][37][38][39] or feature representation [40,41] models adopt both L 1 and L 2 norms to construct their Penalty, thereby achieving model sparsity [45] or generality [38]. Nonetheless, as mentioned before, Penalty and Loss are two different and critical components of an LF model's learning objective.…”
Section: B. Related Work
confidence: 99%
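The statement above notes that the cited models combine L1 and L2 norms in their penalty term, the L1 part encouraging sparsity and the L2 part generality. A minimal sketch of such a combined penalty (the function and parameter names `elastic_penalty`, `lam1`, `lam2` are illustrative, not from the cited papers):

```python
import numpy as np

def elastic_penalty(W, lam1=0.1, lam2=0.1):
    """Combined L1 + L2 penalty on a parameter matrix W.

    The L1 term (sum of absolute values) promotes sparsity; the squared
    L2 term keeps parameters small for generality. lam1/lam2 trade the
    two effects off, as in elastic-net-style regularization.
    """
    return lam1 * np.abs(W).sum() + lam2 * np.square(W).sum()
```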
“…Based on tensor analysis [29] and sparse regularization, Zhang et al. [30] proposed the L1-regularized multiway canonical correlation analysis (L1-MCCA). L1 regularization [31][32][33] was used to further strengthen MCCA's channel-array optimization, so that L1-MCCA yields the most effective projection vectors when learning an optimized reference signal. During training, L1-MCCA has a stronger ability to learn projection vectors and to prevent overfitting.…”
Section: Introduction
confidence: 99%
“…One natural solution is to fill in the missing values and then view the recovered tensors as the extracted features. Many tensor completion techniques have been extended from matrix completion cases [23], [24], which are widely used for predicting missing data given partially observed entries and have drawn much attention in many applications such as image/video recovery [25], [26]. For example, Liu et al [25] defined the Tucker-based tensor nuclear norm by combining nuclear norms of all matrices unfolded along each mode and proposed a high accuracy low-rank tensor completion algorithm (HaLRTC) for estimating missing values in tensor visual data.…”
Section: Introduction
confidence: 99%
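The quoted passage describes the Tucker-based tensor nuclear norm as a weighted sum of the nuclear norms of the tensor's mode unfoldings. A hedged sketch of that quantity (helper names `unfold` and `overlapped_nuclear_norm` are illustrative, and the equal default weights are one common choice, not necessarily HaLRTC's):

```python
import numpy as np

def unfold(T, mode):
    """Matricize tensor T along the given mode (rows indexed by that mode)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def overlapped_nuclear_norm(T, weights=None):
    """Weighted sum of nuclear norms of all mode unfoldings of T.

    Each unfolding's nuclear norm is the sum of its singular values;
    combining them over all modes gives the Tucker-style tensor nuclear
    norm used by HaLRTC-type completion methods.
    """
    if weights is None:
        weights = [1.0 / T.ndim] * T.ndim  # equal weights summing to 1
    return sum(
        w * np.linalg.svd(unfold(T, k), compute_uv=False).sum()
        for k, w in enumerate(weights)
    )
```

For a rank-one tensor built as an outer product of vectors, every unfolding has rank one, so each nuclear norm is simply the product of the vector norms, which gives a quick sanity check.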