2015
DOI: 10.1137/140971518
On Iteratively Reweighted Algorithms for Nonsmooth Nonconvex Optimization in Computer Vision

Abstract: Natural image statistics indicate that we should use nonconvex norms for most regularization tasks in image processing and computer vision. Still, they are rarely used in practice due to the challenge of optimization. Recently, iteratively reweighted ℓ1 minimization (IRL1) has been proposed as a way to tackle a class of nonconvex functions by solving a sequence of convex ℓ2-ℓ1 problems. We extend the problem class to the sum of a convex function and a (nonconvex) nondecreasing function applied to another convex fu…
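To make the IRL1 idea from the abstract concrete, here is a minimal sketch (not the paper's algorithm; the separable denoising model, the log penalty, and all parameter names are assumptions for illustration). Each outer iteration linearizes the concave penalty at the current iterate, and the resulting weighted ℓ1 subproblem has a closed-form soft-thresholding solution:

```python
import numpy as np

def irl1_log_denoise(b, lam=0.5, eps=0.1, iters=20):
    """IRL1 sketch for the separable model (illustrative assumption):
        min_x 0.5*||x - b||^2 + lam * sum_i log(1 + |x_i|/eps).
    Each outer step majorizes the concave log penalty by its tangent at
    the current iterate, giving weights w_i = 1/(eps + |x_i|); the
    weighted l1 subproblem is then solved exactly by soft-thresholding."""
    x = b.copy()
    for _ in range(iters):
        w = 1.0 / (eps + np.abs(x))                      # reweighting step
        x = np.sign(b) * np.maximum(np.abs(b) - lam * w, 0.0)  # weighted l1 prox
    return x
```

Note the characteristic behavior: large entries receive small weights and are shrunk less than under plain ℓ1, while small entries receive large weights and are driven to zero, which is exactly why the nonconvex penalty is attractive for sparse regularization.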

Cited by 171 publications (179 citation statements)
References 82 publications (165 reference statements)
“…Instead, we would recommend when possible to use majorization-minimization (MM) algorithms, based on convex upper bounds of f. For instance, it is of interest to be able to solve problem (1) for non-convex functions φ, and in particular so-called concave penalties such as MCP, SCAD and others; for these formulations, MM schemes that solve a sequence of TV problems are efficient ([53]) and can be advantageously combined with cut pursuit, since the latter will leverage the partition of the previous iterate as a warm start for the next iteration. This is the scheme we use in Section 3.2.…”
Section: Algorithm 1 Cut Pursuitmentioning
confidence: 99%
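The MM construction referenced in the excerpt above linearizes the concave penalty at the current iterate; in standard notation (our notation, not quoted from the cited paper), the convex majorizer is

\[
\varphi(t) \;\le\; \varphi(t^k) + \varphi'(t^k)\,(t - t^k), \qquad t,\, t^k \ge 0,
\]

so minimizing the majorized objective reduces to a weighted TV problem with weights \(\varphi'(t^k)\), which is what makes the warm-start combination with cut pursuit effective.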
“…We considered a non-convex counterpart of the total variation, similar to the formulations considered in [51] or [71], but with t ↦ (ε + t)^{1/2} in lieu of t ↦ |t|. The resulting functional can be minimized locally using a reweighted TV scheme described in [53]. We use our cut pursuit algorithm to solve each reweighted TV problem as it is the fastest implementation.…”
Section: A Merge Effectively Decreases the Value Of The Objective Andmentioning
confidence: 99%
“…We have generalized these results in a more recent paper [12], in which we only impose (C1)-(C4) and include the problematic cases where Θ is nonconvex and the set of stationary points S_Θ := {x ∈ ℝⁿ : ∇Θ(x) = 0} is nondiscrete. (The results in [12] are also stronger than those derived from Theorem 2 in [20], which further require the potentials to have locally Lipschitz continuous derivatives.) However, two limitations remain.…”
mentioning
confidence: 86%
“…Recently, Candes et al [24] proposed the iteratively reweighted algorithm for solving compressive sensing problems that involve a nonconvex log penalty instead of an ℓ1-norm. Furthermore, Ochs et al [25] extended this to the iteratively convex majorization-minimization method for solving nonsmooth nonconvex optimization problems. In addition, they provided various versions of iteratively reweighted algorithms, with convergence analysis under certain conditions.…”
Section: Introductionmentioning
confidence: 99%
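The reweighting step in the Candès et al. scheme cited above can be written, in its standard form (our notation, not quoted from the paper), as

\[
w_i^{k+1} = \frac{1}{\lvert x_i^k \rvert + \varepsilon},
\]

i.e., the weight is the derivative of the smoothed log penalty \(\log(\lvert x_i \rvert + \varepsilon)\) evaluated at the current iterate, so each outer iteration solves a weighted ℓ1 problem that majorizes the nonconvex objective.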
“…This further improves the quality of restored images by preserving fine scales while denoising homogeneous regions. In order to solve nonconvex nonsmooth problems, we utilize the iteratively reweighted algorithm [25] and the alternating direction method of multipliers (ADMM). This results in fast and efficient algorithms.…”
Section: Introductionmentioning
confidence: 99%