2014
DOI: 10.1137/120896219

Fast Alternating Direction Optimization Methods

Abstract: Alternating direction methods are a common tool for general mathematical programming and optimization. These methods have become particularly important in the field of variational image processing, which frequently requires the minimization of nondifferentiable objectives. This paper considers accelerated (i.e., fast) variants of two common alternating direction methods: the alternating direction method of multipliers (ADMM) and the alternating minimization algorithm (AMA). The proposed acceleration is of the …
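To ground the abstract's terminology, the following is a minimal sketch of the standard (unaccelerated) two-block ADMM applied to a lasso problem. The lasso instance, function names, and parameter choices are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def soft_threshold(v, kappa):
    # Proximal operator of kappa * ||.||_1 (elementwise shrinkage).
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

def admm_lasso(A, b, lam, rho=1.0, n_iter=200):
    # Two-block ADMM for: min 0.5*||Ax - b||^2 + lam*||z||_1  s.t.  x - z = 0.
    m, n = A.shape
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable
    # Factor the matrix used by every x-update once, up front.
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    Atb = A.T @ b
    for _ in range(n_iter):
        # x-update: smooth least-squares term plus quadratic penalty.
        rhs = Atb + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))
        # z-update: proximal step on the nondifferentiable l1 term.
        z = soft_threshold(x + u, lam / rho)
        # Dual update: ascent on the residual of the constraint x - z = 0.
        u = u + x - z
    return z

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
b = rng.standard_normal(50)
print(admm_lasso(A, b, lam=0.1)[:5])
```

The split isolates the nondifferentiable l1 term in its own subproblem, which is exactly the property that makes alternating direction methods attractive for the variational imaging objectives the abstract mentions.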

Cited by 663 publications (641 citation statements)
References 44 publications
“…More recent convergence rate analysis of the ADMM still requires at least one of the component functions (f_1 or f_2) to be strongly convex and have a Lipschitz continuous gradient. Under these and additional rank conditions on the constraint matrix E, some linear convergence rate results can be obtained for a subset of primal and dual variables in the ADMM algorithm (or its variant); see [6,12,24]. However, when there are more than two blocks involved (K ≥ 3), the convergence (or the rate of convergence) of the ADMM method is unknown, and this has been a key open question for several decades.…”
Section: Alternating Direction Methods of Multipliers (ADMM)
confidence: 99%
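For reference, the two-block setting this excerpt discusses is usually written as follows; the notation is an assumption chosen to match the excerpt's symbols (f_1, f_2, E), not copied from the quoted paper.

```latex
\min_{x_1,\,x_2}\; f_1(x_1) + f_2(x_2)
\quad\text{s.t.}\quad E_1 x_1 + E_2 x_2 = b,
```

with ADMM alternating one minimization per block before a dual ascent step:

```latex
x_1^{k+1} = \operatorname*{arg\,min}_{x_1}\; \mathcal{L}_\rho\big(x_1, x_2^{k}, \lambda^{k}\big), \quad
x_2^{k+1} = \operatorname*{arg\,min}_{x_2}\; \mathcal{L}_\rho\big(x_1^{k+1}, x_2, \lambda^{k}\big), \quad
\lambda^{k+1} = \lambda^{k} + \rho\,\big(E_1 x_1^{k+1} + E_2 x_2^{k+1} - b\big),
```

where \mathcal{L}_\rho is the augmented Lagrangian with penalty parameter \rho > 0. The K ≥ 3 case extends this sweep to K blocks, which is where the open convergence question the excerpt describes arises.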
“…However, it is possible to accelerate the convergence of both algorithms to reach an optimal O(1/k²) rate of the residuals [25]. Most importantly, these accelerated versions are obtained at a completely negligible cost, except for keeping in memory the fields from the previous iteration.…”
Section: Accelerated Versions of Augmented Lagrangian Methods
confidence: 99%
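The O(1/k²) rate comes from a Nesterov-type extrapolation step; a standard form of that step is sketched below (notation assumed, not quoted from the citing paper). Its only memory cost is the previous iterate z_{k-1}, which matches the excerpt's remark about keeping the fields from the previous iteration.

```latex
\alpha_{k+1} = \frac{1 + \sqrt{1 + 4\alpha_k^2}}{2}, \qquad
\hat{z}_{k+1} = z_k + \frac{\alpha_k - 1}{\alpha_{k+1}}\,\big(z_k - z_{k-1}\big).
```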
“…Most importantly, these accelerated versions are obtained at a completely negligible cost, except for keeping in memory the fields from the previous iteration. Following Nesterov's predictor-corrector scheme [25,26], the previous steps (6)-(9) are now replaced by:…”
Section: Accelerated Versions of Augmented Lagrangian Methods
confidence: 99%
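The quoted steps (6)-(9) are truncated above. For orientation only, here is a schematic of the predictor-corrector pattern in the style of the accelerated ADMM of [25]: a standard ADMM sweep from extrapolated points, followed by Nesterov momentum on the second primal block and the dual variable. All names and the scaled-form constraint x - z = c are illustrative assumptions.

```python
import numpy as np

def fast_admm(prox_f, prox_g, c, rho=1.0, n_iter=200):
    # Schematic accelerated two-block ADMM for:
    #   min f(x) + g(z)  s.t.  x - z = c   (names hypothetical).
    # prox_f(v, rho) and prox_g(v, rho) are user-supplied proximal
    # operators of f and g with penalty parameter rho.
    n = c.shape[0]
    z = z_hat = np.zeros(n)
    u = u_hat = np.zeros(n)   # scaled dual variable
    alpha = 1.0
    for _ in range(n_iter):
        # Predictor: one ordinary ADMM sweep from the extrapolated points.
        x = prox_f(z_hat + c - u_hat, rho)
        z_new = prox_g(x - c + u_hat, rho)
        u_new = u_hat + x - z_new - c
        # Corrector: Nesterov extrapolation of z and u. Only the previous
        # (z, u) pair must be stored -- the "negligible cost" in the excerpt.
        alpha_new = (1.0 + np.sqrt(1.0 + 4.0 * alpha**2)) / 2.0
        beta = (alpha - 1.0) / alpha_new
        z_hat = z_new + beta * (z_new - z)
        u_hat = u_new + beta * (u_new - u)
        z, u, alpha = z_new, u_new, alpha_new
    return z
```

In [25] this kind of acceleration is analyzed under strong convexity, with a restart rule handling the general case; the sketch above omits restarts for brevity.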