2021
DOI: 10.1016/j.neunet.2020.09.021
Low Rank Regularization: A review

Cited by 95 publications (82 citation statements)
References 136 publications
“…Due to the incomplete or corrupted observation of the low-rank matrix, many previous attempts focus on the low-rank matrix recovery problem [10]. Hu et al. [13] provide a comprehensive survey of Low Rank Regularization (LRR), mainly analysing the effect of using LRR as a loss function via the nuclear norm or other tools. Cui et al. [8] and Xiong et al. [37] apply the nuclear norm and the weighted nuclear norm, respectively, as loss functions to minimize matrix rank for image classification, and both achieve superior results.…”
Section: Related Work
confidence: 99%
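The nuclear norm and weighted nuclear norm mentioned in this statement are straightforward to compute from singular values. A minimal NumPy sketch, for illustration only: the `weights` vector is a hypothetical choice, not one taken from the cited papers.

```python
# Illustrative sketch: nuclear norm and weighted nuclear norm of a matrix,
# computed from its singular values.
import numpy as np

def nuclear_norm(X: np.ndarray) -> float:
    """Nuclear norm ||X||_* = sum of singular values."""
    return float(np.linalg.svd(X, compute_uv=False).sum())

def weighted_nuclear_norm(X: np.ndarray, weights: np.ndarray) -> float:
    """Weighted nuclear norm: sum_i w_i * sigma_i(X).

    `weights` is a hypothetical per-singular-value weight vector; putting
    larger weights on small singular values pushes them toward zero.
    """
    sigma = np.linalg.svd(X, compute_uv=False)
    return float((weights[: len(sigma)] * sigma).sum())

# Example: a rank-2 matrix has only two nonzero singular values, so its
# nuclear norm is small relative to a full-rank matrix of the same size.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 50))
print(nuclear_norm(X))
```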
“…As each column of the matrix Z represents an encoded state, rank(Z) can be used to represent the diversity within the matrix, since a higher rank(Z) denotes greater linear independence among the encoded states. Unlike previous studies and applications based on matrix rank [11,8,13,37], which employ low-rank modeling, we instead use the rank by maximizing rank(Z) to enlarge the exploration diversity, encouraging the agent to visit more distinct states of high diversity. Thus, the intuition behind our intrinsic reward can be expressed as: $\max_{Z} \operatorname{rank}(Z)$.…”
Section: Nuclear-Norm Maximization-Based Intrinsic Rewards
confidence: 99%
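In practice, $\max_{Z} \operatorname{rank}(Z)$ is typically relaxed to maximizing the nuclear norm of Z, which is tractable to compute. A minimal sketch of such an intrinsic reward, assuming the agent's encoded states are stacked as the columns of Z; the interface and scaling are illustrative, not the cited paper's exact formulation.

```python
# Illustrative sketch of a nuclear-norm-based intrinsic reward.
import numpy as np

def intrinsic_reward(encoded_states: np.ndarray) -> float:
    """Reward = ||Z||_* as a tractable surrogate for maximizing rank(Z).

    encoded_states: (d, n) matrix whose columns are n encoded states.
    A higher nuclear norm indicates greater linear independence
    (i.e., diversity) among the visited states.
    """
    return float(np.linalg.svd(encoded_states, compute_uv=False).sum())

# Diverse (linearly independent) states earn a larger reward than
# near-duplicate states.
rng = np.random.default_rng(1)
diverse = rng.standard_normal((16, 8))
repeated = np.tile(rng.standard_normal((16, 1)), (1, 8))
print(intrinsic_reward(diverse) > intrinsic_reward(repeated))
# True (with high probability for random draws)
```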
“…It is well known that matrix rank minimization is an NP-hard problem. Nuclear norm minimization, as a relaxation of the low-rank regularizer, is a common alternative to directly computing the rank of the matrix [41], [42]. With the rank of X replaced by its nuclear norm, problem (5) is expressed as follows:…”
Section: B. The Proposed SSLRSU Model
confidence: 99%
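A standard building block for solving such nuclear-norm-relaxed problems is singular value thresholding (SVT), the proximal operator of the nuclear norm. A minimal sketch follows; the objective shown is the generic proximal form, not the specific model of problem (5) in the citing paper.

```python
# Singular value thresholding: the proximal operator of the nuclear norm,
# solving min_X tau*||X||_* + 0.5*||X - Y||_F^2 in closed form.
import numpy as np

def svt(Y: np.ndarray, tau: float) -> np.ndarray:
    """Soft-threshold the singular values of Y by tau.

    Shrinking small singular values to zero lowers the rank of the
    result, which is why SVT appears inside low-rank solvers.
    """
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)
    return (U * s_shrunk) @ Vt

# Example: a noisy rank-4 matrix; thresholding removes the noise directions.
rng = np.random.default_rng(2)
Y = rng.standard_normal((20, 4)) @ rng.standard_normal((4, 20)) \
    + 0.1 * rng.standard_normal((20, 20))
X = svt(Y, tau=2.0)
print(np.linalg.matrix_rank(X))  # typically 4
```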
“…Another approach that we consider is the greedy heuristic of atomic decomposition for minimum rank approximation (ADMiRA) proposed by Lee and Bresler [14], which aims to find k rank-one matrices representing the original matrix. Finally, in addition to nuclear, we consider the schatten approach proposed by Mohan and Fazel [17] to solve the Schatten p-norm minimization problem with p = 1/2, one of the non-convex relaxation methods for which k is a priori unknown (see, e.g., Hu et al. [11]). The tolerance is again set at $10^{-6}$ for $\|X_{s+1} - X_s\|_F$, and the maximum number of iterations is set higher at $N_{\max} = 10000$ for these methods, given that their (least-squares) subproblems require less computational effort.…”
Section: Recoverability
confidence: 99%
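For reference, a minimal sketch of the two quantities this statement relies on: the Schatten p-quasi-norm used as a nonconvex rank surrogate (p = 1/2) and the Frobenius-norm stopping test. The solver loop itself is omitted; this is not Mohan and Fazel's actual iteration.

```python
# Illustrative sketch, assuming only the quantities named above.
import numpy as np

def schatten_p_power(X: np.ndarray, p: float = 0.5) -> float:
    """Return ||X||_p^p = sum_i sigma_i(X)^p, which tends to rank(X) as p -> 0."""
    s = np.linalg.svd(X, compute_uv=False)
    return float((s ** p).sum())

def converged(X_next: np.ndarray, X_prev: np.ndarray, tol: float = 1e-6) -> bool:
    """Stopping test: ||X_{s+1} - X_s||_F < tol, with tol = 1e-6 as above."""
    return float(np.linalg.norm(X_next - X_prev, ord="fro")) < tol

# A solver would loop for at most N_max = 10000 iterations, checking
# `converged` after each (least-squares) subproblem solve.
```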