2023
DOI: 10.1109/lsp.2023.3301244
Robust Low-Rank Matrix Recovery as Mixed Integer Programming via $\ell _{0}$-Norm Optimization

Cited by 4 publications (1 citation statement)
References 25 publications
“…To realize rule selection (RS), a Group Lasso regularization term is added to the objective function. Compared with other Lasso regularizations [35][36][37], it induces row- or column-wise sparsity, producing sparsity of rules in a grouped manner and thereby enabling rule selection [38]. Therefore, the objective function of each SFNN-1 contains two parts: the mean square error (MSE) and the Group Lasso penalty term:…”
Section: First-order Sparse TSK Nonstationary Fuzzy Neural Network (S...)
confidence: 99%
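The cited statement combines an MSE loss with a Group Lasso penalty so that whole groups of parameters (e.g. all consequent weights of one fuzzy rule) are driven to zero together. A minimal sketch of such an objective is below; the function name, the choice of rows-as-groups, and the parameter names are illustrative assumptions, not the cited paper's actual implementation.

```python
import numpy as np

def group_lasso_objective(y_true, y_pred, W, lam):
    """MSE plus a Group Lasso penalty over the rows of W (a sketch).

    Each row of W is treated as one group (hypothetically, the
    consequent parameters of one rule). Penalizing the l2 norm of a
    whole row shrinks entire rows toward zero together, which is what
    makes grouped rule selection possible, unlike a plain l1 (Lasso)
    penalty that only zeroes individual entries.
    """
    mse = np.mean((y_true - y_pred) ** 2)
    penalty = lam * np.sum(np.linalg.norm(W, axis=1))  # sum of row l2 norms
    return mse + penalty
```

Setting a row of `W` exactly to zero removes that row's contribution from the penalty entirely, which is why this regularizer can prune whole rules rather than scattered coefficients.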