2019
DOI: 10.48550/arXiv.1908.11482
Preprint

Anderson Accelerated Douglas-Rachford Splitting

Anqi Fu, Junzi Zhang, Stephen Boyd

Abstract: We consider the problem of non-smooth convex optimization with general linear constraints, where the objective function is only accessible through its proximal operator. This problem arises in many different fields such as statistical learning, computational imaging, telecommunications, and optimal control. To solve it, we propose an Anderson accelerated Douglas-Rachford splitting (A2DR) algorithm, which we show either globally converges or provides a certificate of infeasibility/unboundedness under very mild …
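The abstract describes applying Anderson acceleration (AA) to the Douglas-Rachford splitting (DRS) fixed-point iteration. The sketch below shows plain type-II AA for a generic fixed-point map F; it is an illustration only, with the function name, parameters, and memory-window handling chosen by us. It is not the paper's A2DR algorithm, which additionally regularizes and safeguards the AA step to obtain its global convergence guarantees.

```python
import numpy as np

def anderson_accelerate(F, x0, m=5, max_iter=500, tol=1e-8):
    """Type-II Anderson acceleration of the fixed-point iteration x = F(x).

    Illustrative sketch only: A2DR adds regularization and safeguarding
    on top of this basic scheme.
    """
    x = np.asarray(x0, dtype=float)
    Fx = F(x)
    g = Fx - x                       # fixed-point residual F(x) - x
    dX, dG = [], []                  # histories of iterate / residual differences
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if dG:
            S = np.column_stack(dX)  # columns: x_{i+1} - x_i
            Y = np.column_stack(dG)  # columns: g_{i+1} - g_i
            # Mixing coefficients from the least-squares problem
            # min_gamma ||g - Y gamma||_2
            gamma, *_ = np.linalg.lstsq(Y, g, rcond=None)
            x_new = x + g - (S + Y) @ gamma   # Anderson-mixed update
        else:
            x_new = Fx               # plain (Picard) fixed-point step
        Fx_new = F(x_new)
        g_new = Fx_new - x_new
        dX.append(x_new - x)
        dG.append(g_new - g)
        if len(dX) > m:              # sliding memory window of size m
            dX.pop(0)
            dG.pop(0)
        x, Fx, g = x_new, Fx_new, g_new
    return x

# Example: solve x = cos(x); the fixed point is approximately 0.739085.
x_star = anderson_accelerate(np.cos, np.array([1.0]))
```

In the setting of the paper, F would be the (non-expansive) DRS operator assembled from the proximal operators of the objective terms, e.g. a standard form of the DRS map is F(z) = z + prox_{tg}(2 prox_{tf}(z) - z) - prox_{tf}(z) for an objective f + g, so the same mixing scheme applies to each DRS iterate.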

Cited by 7 publications (22 citation statements)
References 24 publications (34 reference statements)

“…However, AA and optimization algorithms have been developed quite independently and only limited connections were discovered and studied [16,18]. Very recently, the technique has started to gain a significant interest in the optimization community (see, e.g., [47,46,5,53,19,39]). Specifically, a series of papers [47,46,5] adapt AA to accelerate several classical algorithms for unconstrained optimization; [53] studies a variant of AA for non-expansive operators; [19] proposes an application of AA to Douglas-Rachford splitting; and [39] uses AA to improve the performance of the ADMM method.…”
Section: Related Work
confidence: 99%
“…Very recently, the technique has started to gain a significant interest in the optimization community (see, e.g., [47,46,5,53,19,39]). Specifically, a series of papers [47,46,5] adapt AA to accelerate several classical algorithms for unconstrained optimization; [53] studies a variant of AA for non-expansive operators; [19] proposes an application of AA to Douglas-Rachford splitting; and [39] uses AA to improve the performance of the ADMM method. There is also an emerging literature on applications of AA in machine learning [23,28,20,36].…”
Section: Related Work
confidence: 99%