2013
DOI: 10.1137/110854746
Nonconvex TV$^q$-Models in Image Restoration: Analysis and a Trust-Region Regularization-Based Superlinearly Convergent Solver

Abstract: A nonconvex variational model is introduced which contains the q-"norm", q ∈ (0, 1), of the gradient of the image to be reconstructed as the regularization term, together with a least-squares-type data fidelity term which may depend on a possibly spatially dependent weighting parameter. Hence, the regularization term in this functional is a nonconvex compromise between the minimization of the support of the reconstruction and the classical convex total variation model. In the discrete setting, existence of a min…
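As a minimal sketch of the model the abstract describes, the discrete objective combines a (possibly spatially weighted) least-squares fidelity with the q-power penalty on the gradient magnitude. The function below is illustrative only; the discretization (forward differences with replicated boundary) and all names are assumptions, not the paper's notation.

```python
import numpy as np

def tvq_objective(u, f, alpha, q=0.5):
    """Sketch of a discrete TV^q objective (illustrative, not the paper's code):

        J(u) = 0.5 * sum alpha * (u - f)^2  +  sum |grad u|^q,  q in (0, 1).

    u, f  : 2-D arrays (candidate reconstruction and observed data)
    alpha : scalar or spatially varying weight, as allowed in the abstract
    """
    # forward-difference gradient with replicated (Neumann-type) boundary
    gx = np.diff(u, axis=1, append=u[:, -1:])
    gy = np.diff(u, axis=0, append=u[-1:, :])
    grad_mag = np.hypot(gx, gy)

    fidelity = 0.5 * np.sum(alpha * (u - f) ** 2)
    regularizer = np.sum(grad_mag ** q)  # nonconvex for q in (0, 1)
    return fidelity + regularizer
```

For q → 1 the regularizer approaches discrete total variation; for q → 0 it counts the support of the gradient, which is the "compromise" the abstract refers to.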

Cited by 90 publications (105 citation statements)
References 39 publications
“…Note that ψ is locally Lipschitz and monotonically increasing, and in the following we shall denote by ∂ψ a subdifferential of ψ. We remark that the consistency of the Huberized stationary points induced by (9.3) towards the stationary points of the original model (9.1) was investigated in the previous work [17,18]. Moreover, the system (9.3) is not differentiable in the classical sense.…”
Section: Numerical Reconstructions
confidence: 86%
“…In fact, in [17] Huber regularisation was used with ϕ(t) = t q for q ∈ (0, 1) for algorithmic reasons. For small γ > 0, this is defined as…”
Section: Discussion
confidence: 99%
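The quote above refers to Huber regularisation of ϕ(t) = t^q for small γ > 0. One common C^1 construction (a sketch; the cited work may use a different variant) keeps t^q above the threshold γ and substitutes a quadratic below it, matched so that value and derivative agree at t = γ:

```python
def huber_tq(t, q=0.5, gamma=1e-2):
    """C^1 Huber-type smoothing of t -> t**q near t = 0 (illustrative variant).

    For t > gamma the original power is kept; for t <= gamma a quadratic
    is used, with coefficients chosen so the two branches have matching
    value and first derivative at t = gamma.
    """
    if t > gamma:
        return t ** q
    # quadratic branch: (q/2) * gamma^(q-2) * t^2 + (1 - q/2) * gamma^q
    return (q / 2.0) * gamma ** (q - 2.0) * t ** 2 + (1.0 - q / 2.0) * gamma ** q
```

This removes the unbounded derivative of t^q at t = 0, which is the algorithmic motivation the quote mentions.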
“…In this work, our choices for H_θ^(k,k) and H_z^(k+1,k) (see formulas (18) and (21) below) will be structured Hessian approximations motivated by the iteratively reweighted least-squares (IRLS) method [13,14]. The solution of the resulting subproblem can be interpreted as a regularized Newton step; see [15,16,17].…”
confidence: 99%
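To make the IRLS idea in the last quote concrete: each sweep replaces the nonsmooth term |x_i|^q by a quadratic majorant with weights w_i = (x_i^2 + ε)^(q/2 − 1), so every iteration reduces to a single regularized linear solve. The sketch below is a generic textbook IRLS loop under these assumptions, not the exact scheme of the cited papers; all parameter names are illustrative.

```python
import numpy as np

def irls_lq(A, b, lam=0.1, q=0.5, eps=1e-6, iters=30):
    """Generic IRLS sketch for min_x 0.5*||A x - b||^2 + lam * sum_i |x_i|^q.

    Each sweep majorizes |x_i|^q at the current iterate by the quadratic
    lam*(q/2)*w_i*x_i^2 with w_i = (x_i^2 + eps)^(q/2 - 1), so the update
    is one linear solve with a structured Hessian approximation -- in the
    spirit of the "regularized Newton step" mentioned in the quote.
    """
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        w = (x ** 2 + eps) ** (q / 2.0 - 1.0)          # reweighting
        H = A.T @ A + lam * q * np.diag(w)             # approximate Hessian
        x = np.linalg.solve(H, A.T @ b)                # regularized step
    return x
```

The small ε > 0 caps the weights when an entry of x approaches zero, which is what keeps each linear system well posed.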