2022
DOI: 10.1109/tcyb.2021.3084931
Nonconvex Structural Sparsity Residual Constraint for Image Restoration

Cited by 14 publications (13 citation statements)
References 76 publications
“…where Ī»_1 > 0 and Ī»_2 ā‰„ 0 are the tradeoff parameters for the group sparsity prior and the side prior, respectively. If Ī»_2 = 0, problem (15) reduces to problem (8). Instead of executing problem (15) separately on each group of similar patches, it can be extended and carried out on all patches of Y.…”
Section: Formulation of Side Group Sparse Coding Model (citation type: mentioning)
confidence: 99%
“…If Ī»_2 = 0, problem (15) reduces to problem (8). Instead of executing problem (15) separately on each group of similar patches, it can be extended and carried out on all patches of Y. Let Z ∈ ℝ^(n×N) be the matrix of all patches of Y, and Ω the similarity matrix for all patches.…”
Section: Formulation of Side Group Sparse Coding Model (citation type: mentioning)
confidence: 99%
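
A minimal sketch of such a side-informed sparse coding step is given below, under assumptions that go beyond what the excerpt states: an orthonormal dictionary D (for example, a patch-group PCA basis), an ℓ1 group sparsity penalty weighted by Ī»_1, and a quadratic side prior weighted by Ī»_2 that pulls the code toward a side estimate β. The actual problem (15) in the citing paper may use different penalties; setting Ī»_2 = 0 recovers plain sparse coding, matching the remark above.

```python
import numpy as np

def soft_threshold(v, tau):
    """Elementwise soft-thresholding: proximal operator of tau * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def side_group_sparse_code(x, D, beta, lam1, lam2):
    """
    Sketch of a side-informed sparse coding step (assumptions, not the cited model):
        min_alpha 0.5*||x - D alpha||^2 + lam1*||alpha||_1 + 0.5*lam2*||alpha - beta||^2
    with D orthonormal. Under these assumptions the minimizer has a closed form.
    """
    c = D.T @ x                                   # analysis coefficients (D orthonormal)
    center = (c + lam2 * beta) / (1.0 + lam2)     # merge data term and side prior
    alpha = soft_threshold(center, lam1 / (1.0 + lam2))
    return D @ alpha, alpha                       # restored patches and their codes
```

Because the objective is separable across columns, the same update applies unchanged when x and beta are matrices whose columns are all patches Z of Y, which is the extension to all patches mentioned in the excerpt.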
“…Evidence demonstrates that image priors are the foundation of image restoration, including total variation (TV) [5][6][7], sparsity [2,8], low-rank [9][10][11], and deep image priors [12][13][14][15][16][17][18][19][20]. In particular, the sparsity prior is considered one of the most remarkable priors for natural images [2,8,[21][22][23][24]. Based on how the sparsity prior is exploited, current algorithms are roughly divided into two classes, namely patch-based [2,25,26] and group-based approaches [8,22,[27][28][29], where the former perform restoration independently on each patch and the latter perform it on each group of similar patches.…”
Section: Introduction (citation type: mentioning)
confidence: 99%
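
Group-based approaches rely on a block-matching step that collects, for each reference patch, its most similar patches before restoring them jointly. The sketch below illustrates only that grouping step; the patch size, stride, group size, and the brute-force full-image search are illustrative assumptions, and practical methods usually restrict the search to a local window around the reference patch.

```python
import numpy as np

def extract_patches(img, p, stride):
    """Slide a p x p window over img with the given stride; return patches as columns."""
    H, W = img.shape
    cols, coords = [], []
    for i in range(0, H - p + 1, stride):
        for j in range(0, W - p + 1, stride):
            cols.append(img[i:i + p, j:j + p].ravel())
            coords.append((i, j))
    return np.stack(cols, axis=1), coords         # Z has shape (p*p, N)

def group_similar_patches(Z, ref_idx, k):
    """Indices of the k patches closest (Euclidean distance) to patch ref_idx."""
    d = np.sum((Z - Z[:, [ref_idx]]) ** 2, axis=0)
    return np.argsort(d)[:k]                      # the reference patch itself is included

# Illustrative usage on a dummy image.
noisy_img = np.random.rand(64, 64)
Z, coords = extract_patches(noisy_img, p=8, stride=4)
group = Z[:, group_similar_patches(Z, ref_idx=0, k=60)]   # one group of similar patches
```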
“…To tackle this problem, the residual models [8,22] assume that each group of similar patches has a true representation, from which the learned representation should not deviate. In contrast to GSR, the residual model is much more difficult to train, since it needs to estimate the true representation; this estimation step is also the key difference among the various algorithms.…”
Section: Introduction (citation type: mentioning)
confidence: 99%
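
The residual idea in the excerpt can be read as: estimate the unknown true representation, then regularize only the deviation of the learned representation from it. The sketch below uses a weighted average of the group's codes as that estimate and soft-thresholds the residual; the cited residual models, and the nonconvex constraint of the paper indexed here, use more refined estimators and penalties, so this only illustrates the overall structure.

```python
import numpy as np

def residual_shrinkage_code(A, weights, tau):
    """
    Sketch of the sparsity-residual idea for one group of similar patches.

    A       : (m, k) sparse codes of the k patches in the group (columns).
    weights : (k,) nonlocal weights for estimating the unknown "true" code
              (a weighted average here -- an assumption, not the cited estimator).
    tau     : shrinkage level applied to the residual.
    """
    w = weights / weights.sum()
    beta = A @ w                                        # estimate of the true representation
    R = A - beta[:, None]                               # sparsity residual per patch
    R = np.sign(R) * np.maximum(np.abs(R) - tau, 0.0)   # shrink only the deviation
    return beta[:, None] + R                            # updated codes stay near the estimate
```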