2019
DOI: 10.1007/s12532-019-00173-3
An adaptive primal-dual framework for nonsmooth convex minimization

Abstract: We propose a new self-adaptive, double-loop smoothing algorithm to solve composite, nonsmooth, and constrained convex optimization problems. Our algorithm is based on Nesterov's smoothing technique via general Bregman distance functions. It self-adaptively selects the number of iterations in the inner loop to achieve a desired complexity bound without requiring the accuracy a priori, as in variants of augmented Lagrangian methods (ALM). We prove an O(1/k) convergence rate on the last iterate of the outer sequence f…
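The double-loop structure described in the abstract can be illustrated with a toy sketch: an outer loop that shrinks a smoothing parameter μ and an inner loop that minimizes the smoothed objective. This is only a schematic, hypothetical example on the scalar problem f(x) = |x − 1| + x²/2, where Nesterov's smoothing of |·| reduces to the Huber function; the paper's actual method uses accelerated (Bregman) steps and an adaptively chosen inner-iteration count, whereas this sketch assumes plain gradient descent, a fixed inner budget, and a geometric μ-schedule.

```python
def huber_grad(z, mu):
    """Gradient of the Nesterov smoothing of |z| with parameter mu
    (the Huber function): z/mu clipped to [-1, 1]."""
    return max(-1.0, min(1.0, z / mu))

def double_loop_smoothing(x0=0.0, mu0=1.0, outer_iters=30, inner_iters=50):
    """Toy double-loop smoothing scheme for f(x) = |x - 1| + x**2 / 2.

    Outer loop: shrink the smoothing parameter mu.
    Inner loop: gradient descent on the smoothed objective
    f_mu(x) = huber_mu(x - 1) + x**2 / 2, whose gradient is
    Lipschitz with constant (1/mu + 1), giving step = 1/(1/mu + 1).
    """
    x, mu = x0, mu0
    for _ in range(outer_iters):
        step = 1.0 / (1.0 / mu + 1.0)
        for _ in range(inner_iters):
            grad = huber_grad(x - 1.0, mu) + x
            x -= step * grad
        mu *= 0.7  # fixed geometric decrease; the paper adapts this choice
    return x

x_hat = double_loop_smoothing()
```

Here the nonsmooth minimizer is x* = 1, and the iterate approaches it as μ shrinks; the quality of the final iterate is governed by the last value of μ, which is why the rate at which μ is decreased (and the inner budget per value of μ) matters.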

Cited by 21 publications (28 citation statements) · References 108 publications
“…There is a rich literature on primal-dual algorithms searching for a saddle point of L (see [45] and references therein). In the special case where f = 0, the alternating direction method of multipliers (ADMM) proposed by Glowinski and Marrocco [25] and Gabay and Mercier [23], and the algorithm of Chambolle and Pock [12], are amongst the most celebrated ones.…”
Section: Motivation
confidence: 99%
“…As we can see from (8) of Lemma 1, the bound on dist_K(Ax + By − c) depends on λ rather than on λ − λ_0 for an initial dual variable λ_0. We use the idea of "restarting the prox-center point" from [47,48] to adaptively update λ_0. This idea has recently been used in [38,49] as a restarting strategy, and it has significantly improved the performance of the algorithms.…”
Section: Shifting the Initial Dual Variable and Restarting
confidence: 99%
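The "restarting the prox-center point" idea quoted above can be sketched on a toy problem. This is a hypothetical illustration, not the cited papers' algorithm: for min ½x² subject to x = 1, a prox-smoothed dual ascent keeps a penalty term ‖λ − λ_0‖², and after each inner loop the center λ_0 is shifted to the current dual iterate, so the bound driven by ‖λ − λ_0‖ restarts from zero instead of growing with ‖λ‖. The step size, inner budget, and prox weight ρ below are arbitrary choices for the sketch.

```python
def restarted_dual_ascent(t_outer=20, t_inner=25, step=0.2, rho=1.0):
    """Toy illustration of shifting the prox-center lam0 in the dual.

    Problem: min 0.5*x**2  s.t.  x = 1, whose dual optimum is lam* = -1
    with primal solution x* = 1.
    """
    lam, lam0 = 0.0, 0.0
    for _ in range(t_outer):
        for _ in range(t_inner):
            # primal step: x(lam) = argmin_x 0.5*x**2 + lam*(x - 1) = -lam
            x = -lam
            # ascent on the prox-smoothed dual
            # d(lam) - (1/(2*rho)) * (lam - lam0)**2
            grad = (x - 1.0) - (lam - lam0) / rho
            lam += step * grad
        lam0 = lam  # restart: shift the prox-center to the current iterate
    return lam, -lam

lam_hat, x_hat = restarted_dual_ascent()
```

Without the shift, the prox term keeps pulling λ back toward the fixed initial center; with it, each inner loop is penalized only for moving away from where the previous one ended, which is the effect the quoted passage credits for the improved performance.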
“…where ∇ϕ_ρ is given by (47). Since proving the convergence of this variant is beyond the scope of this paper, we refer to our forthcoming work [46] for the full theory of restarting.…”
Section: Shifting the Initial Dual Variable and Restarting
confidence: 99%