2021
DOI: 10.1137/20m1387213
A Stochastic Proximal Alternating Minimization for Nonsmooth and Nonconvex Optimization

Abstract: In this work, we introduce a novel stochastic proximal alternating linearized minimization (PALM) algorithm [6] for solving a class of nonsmooth and nonconvex optimization problems. Large-scale imaging problems are becoming increasingly prevalent due to advances in data acquisition and computational capabilities. Motivated by the success of stochastic optimization methods, we propose a stochastic variant of proximal alternating linearized minimization. We provide global convergence guarantees, demonstra…
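To make the alternating structure concrete, here is a minimal sketch of a stochastic-PALM-style loop on a toy sparse matrix-factorization objective. The objective, step size, and function names are illustrative assumptions, not taken from the paper: each block variable takes a minibatch gradient step on the smooth coupling term, followed by the proximal map of its nonsmooth regularizer.

```python
import numpy as np

def prox_l1(v, t):
    # Soft-thresholding: proximal map of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def stochastic_palm(A, rank=5, steps=200, batch=16, lr=1e-2, lam=0.1, seed=0):
    """Illustrative stochastic PALM loop for the toy objective
        min_{X,Y}  lam*||X||_1 + lam*||Y||_1 + (1/2b) sum_{i in batch} ||A_i - X Y_i||^2,
    alternating proximal gradient steps with minibatch gradients over columns."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    X = rng.standard_normal((m, rank)) * 0.1
    Y = rng.standard_normal((rank, n)) * 0.1
    for _ in range(steps):
        idx = rng.choice(n, size=batch, replace=False)  # sample a column minibatch
        R = X @ Y[:, idx] - A[:, idx]                   # minibatch residual
        # x-block: stochastic gradient step on the coupling term, then prox
        X = prox_l1(X - lr * (R @ Y[:, idx].T) / batch, lr * lam)
        R = X @ Y[:, idx] - A[:, idx]                   # refresh residual after X update
        # y-block: same pattern on the sampled columns of Y
        Y[:, idx] = prox_l1(Y[:, idx] - lr * (X.T @ R) / batch, lr * lam)
    return X, Y
```

The key PALM feature preserved here is that each block is updated with the other block held at its most recent value; the stochastic twist is that the coupling-term gradient is estimated from a minibatch rather than computed in full.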

Cited by 15 publications (7 citation statements)
References 21 publications (4 reference statements)
“…This proves the geometric decay of Γ_k in expectation. Similar to Appendix B in (Driggs et al. 2021), we also have that the third condition in Definition 4 holds. This completes the proof.…”
Section: Proof of Proposition
confidence: 86%
“…1. Alternatively, we may also jointly optimize these parameters via alternating minimization (Bolte et al., 2014; Pock and Sabach, 2016; Driggs et al., 2021). 2.…”
Section: Algorithmic Framework
confidence: 99%
“…To avoid such difficulty, many stochastic algorithms for nonconvex problems involving three terms have also been proposed. In (Driggs et al., 2021), the authors presented a stochastic proximal alternating linearized minimization algorithm (called SPRING) that combines a class of variance-reduced stochastic gradient estimators. In (Yurtsever et al., 2021), the authors extended the DYS algorithm to a stochastic setting in which unbiased stochastic gradient estimators were considered.…”
Section: The Proposed Algorithm and Related Work
confidence: 99%
“…For solving problem (2) with a three-block structure, we propose a new stochastic alternating algorithm. Compared with stochastic PALM (SPRING) (Driggs et al., 2021), which treats H(x, y) as a finite sum whose full gradients are approximated by variance-reduced gradient estimators, our algorithm mainly targets the case where the function G has large-scale structure; thus we approximate the full gradient of G by an unbiased stochastic gradient estimator.…”
Section: Contributions
confidence: 99%