2020
DOI: 10.1007/s11075-020-00876-y
Adaptive total variation and second-order total variation-based model for low-rank tensor completion

Cited by 5 publications (6 citation statements) · References 41 publications
“… can be understood as the proportion of low-frequency information each direction contains in the mixed-frequency map. In order to solve problem (10), we use the proximal alternating optimization (PAO) algorithm [ 26 ], which is guaranteed to converge to a critical point under specific circumstances. where is the objective function in problem (10); is a positive model parameter; and variables with the superscript mean the corresponding variables in the previous iteration.…”
Section: The Proposed Methods (citation type: mentioning)
confidence: 99%
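The statement above describes a proximal alternating optimization (PAO) scheme: each block of variables is updated by minimizing the objective plus a proximal penalty tying it to the previous iterate, which under suitable assumptions converges to a critical point. A minimal sketch, on a toy two-variable objective f(x, y) = (x − 1)² + (y − 1)² + xy chosen purely for illustration (it is not the model from the paper, and the step parameter `rho` is an assumed stand-in for the positive model parameter mentioned in the quote):

```python
def pao(rho=1.0, iters=200):
    """Proximal alternating optimization on
    f(x, y) = (x - 1)^2 + (y - 1)^2 + x*y  (illustrative toy objective)."""
    x, y = 0.0, 0.0
    for _ in range(iters):
        # x-step: argmin_x f(x, y) + (rho/2)*(x - x_prev)^2
        # setting the derivative 2(x-1) + y + rho*(x - x_prev) = 0 gives:
        x = (2.0 - y + rho * x) / (2.0 + rho)
        # y-step: argmin_y f(x, y) + (rho/2)*(y - y_prev)^2
        y = (2.0 - x + rho * y) / (2.0 + rho)
    return x, y
```

For this toy objective the stationarity conditions 2(x − 1) + y = 0 and 2(y − 1) + x = 0 give the critical point x = y = 2/3, which the iteration approaches geometrically.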
“…According to [19], the unique solution of Equation (14) can be efficiently found by the conjugate gradient (CG) algorithm [26].…”
Section: The Proposed Methods (citation type: mentioning)
confidence: 99%
“…The global convergence of the proposed proximal alternating minimization (PAM) algorithm for addressing the proposed model was also proved. Li et al. [11] introduced the adaptive total variation and second-order total variation-based model, which can effectively preserve local smoothness and alleviate the staircase effect. Some other results can be found in [12,13].…”
Section: Introduction (citation type: mentioning)
confidence: 99%
“…But second-order TV may lead to over-smoothing of the recovered images and cannot preserve details and edges well. Therefore, Li et al. [18] proposed a parallel matrix factorization model for LRTC that combines first-order and second-order regularizations.…”
(citation type: mentioning)
confidence: 99%
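The quoted trade-off (first-order TV yields staircase artifacts on smooth gradients; second-order TV tolerates them) can be seen directly from discrete difference operators. A small 1-D sketch of a composite regularizer αTV₁ + βTV₂ of the kind the combined models use; the function names and weights are illustrative, not the paper's notation:

```python
import numpy as np

def tv1(x):
    """First-order total variation: sum of |x[i+1] - x[i]|."""
    return np.abs(np.diff(x)).sum()

def tv2(x):
    """Second-order total variation: sum of |x[i+1] - 2x[i] + x[i-1]|."""
    return np.abs(np.diff(x, n=2)).sum()

def composite_tv(x, alpha=1.0, beta=1.0):
    """Weighted combination of first- and second-order TV."""
    return alpha * tv1(x) + beta * tv2(x)
```

A linear ramp such as `[0, 1, 2, 3, 4]` has positive first-order TV but zero second-order TV, which is why adding the second-order term avoids penalizing smooth gradients and thus mitigates the staircase effect.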
“…For the above-mentioned existing methods, some only consider the removal of mixed noise without missing entries, such as [43,1,35], and therefore cannot perform well with partial observations. The model proposed in [18] combines the advantages of first-order TV and second-order TV, but it mainly addresses cases with missing data and does not consider mixed noise removal. TNTV [26] considers recovery with missing entries and mixed noise, but using only first-order TV easily leads to undesirable staircase effects.…”
(citation type: mentioning)
confidence: 99%