2017
DOI: 10.1007/s10915-017-0621-6
On the Information-Adaptive Variants of the ADMM: An Iteration Complexity Perspective

Cited by 69 publications (95 citation statements) | References 34 publications
“…Duchi et al. [10] proposed a stochastic zeroth-order mirror-descent method for convex and strongly convex functions and proved an O(1/√R) rate for a zeroth-order stochastic gradient method on convex objectives. Gao et al. [11] proposed a stochastic gradient ADMM method that requires only noisy estimates of function values to be accessible, and proved an O(1/R) rate under certain requirements on the smoothing parameter and batch size. Recently, Lian et al. [12] proposed an asynchronous stochastic optimization algorithm based on zeroth-order methods and proved a convergence rate of O(1/√R).…”
Section: A. Related Work (mentioning)
confidence: 99%
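The "smoothing parameter and batch size" requirements mentioned in this excerpt concern gradient estimates built purely from noisy function values. Below is a minimal sketch of a standard two-point zeroth-order gradient estimator; the test function and all parameter values are illustrative assumptions, not taken from [10]-[12]:

```python
import numpy as np

def zo_gradient(f, x, mu=1e-4, batch_size=10, rng=None):
    """Two-point zeroth-order gradient estimate of f at x.

    Averages (f(x + mu*u) - f(x)) / mu * u over `batch_size` random
    Gaussian directions u; only function evaluations are used, so the
    estimate approximates the gradient of the mu-smoothed surrogate of f.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = x.shape[0]
    g = np.zeros(d)
    for _ in range(batch_size):
        u = rng.standard_normal(d)
        g += (f(x + mu * u) - f(x)) / mu * u
    return g / batch_size

# Usage: estimate the gradient of a simple quadratic.
f = lambda x: 0.5 * x @ x                          # true gradient is x
x = np.array([1.0, -2.0, 3.0])
print(zo_gradient(f, x, mu=1e-5, batch_size=200))  # ~ [1, -2, 3]
```

A smaller smoothing parameter mu reduces the bias of the estimate, while a larger batch size reduces its variance; the rate conditions in [11] trade these off against the target accuracy.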
“…Lemma II.1. [11] Suppose that G_µ(x^r, v^r, ξ^r) is defined as in (7), and assumptions A.1 and A.2 hold. Then…”
Section: Algorithm Development (mentioning)
confidence: 99%
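The lemma statement is truncated in this excerpt; results of this kind typically bound the bias and second moment of the estimator G_µ. For context, G_µ commonly denotes a two-point difference quotient along a single random direction v, with the noise index ξ absorbed into the function evaluation. A hedged sketch matching that signature (the exact definition (7) in [11] may differ):

```python
import numpy as np

# A common single-direction form of G_mu (assumed here; the cited
# paper's definition (7) may differ): a difference quotient of noisy
# function values along a random direction v.
def G_mu(f, x, v, mu):
    return (f(x + mu * v) - f(x)) / mu * v

# Averaged over many directions, the estimate tracks the true gradient.
f = lambda x: np.sum(np.log1p(np.exp(x)))      # assumed smooth test loss
grad_f = lambda x: 1.0 / (1.0 + np.exp(-x))    # its exact gradient
x = np.array([0.5, -1.0, 2.0])
rng = np.random.default_rng(0)
est = np.mean([G_mu(f, x, rng.standard_normal(3), 1e-4)
               for _ in range(20_000)], axis=0)
# Close to 0 up to Monte-Carlo noise plus an O(mu) smoothing bias.
print(np.linalg.norm(est - grad_f(x)))
```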
“…Another way to handle the non-diagonal F and the expected objective function E[l(x, ξ)] is stochastic ADMM-like methods [16,25,8,19,22,2,26,20], which aim to solve the following problem after introducing an additional variable z: min_{x∈X, z=Fx} l(x) + r(z),…”
Section: Introduction (mentioning)
confidence: 99%
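The splitting above is exactly what classical ADMM exploits: alternating minimization of the augmented Lagrangian of l(x) + r(z) subject to Fx = z. A minimal deterministic sketch with the assumed instantiations l(x) = ½‖Ax − b‖² and r(z) = λ‖z‖₁ (a generalized lasso; these choices are illustrative, not from the cited papers):

```python
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_splitting(A, b, F, lam=0.1, rho=1.0, iters=200):
    """ADMM for min 0.5||Ax-b||^2 + lam*||z||_1  s.t.  Fx = z.

    Instantiates min_{x, z=Fx} l(x) + r(z) with a quadratic l and an
    l1 regularizer r (a generalized lasso).
    """
    n, m = A.shape[1], F.shape[0]
    x, z, u = np.zeros(n), np.zeros(m), np.zeros(m)  # u: scaled dual
    # The x-update matrix is fixed across iterations, so a
    # factorization could be cached in a serious implementation.
    M = A.T @ A + rho * F.T @ F
    for _ in range(iters):
        x = np.linalg.solve(M, A.T @ b + rho * F.T @ (z - u))
        z = soft_threshold(F @ x + u, lam / rho)
        u = u + F @ x - z
    return x

# Usage on synthetic data (F = identity recovers the ordinary lasso).
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
b = rng.standard_normal(50)
x = admm_splitting(A, b, np.eye(20), lam=0.5)
```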
“…Ouyang et al. [16], Suzuki [19], Azadi and Sra [2], Gao et al. [8], and recently Zhao et al. [25] developed several stochastic variants of ADMM, which linearize l using its noisy subgradient or gradient and add a varying proximal term. Furthermore, Zhong and Kwok [26] and Suzuki [20] respectively proposed a stochastic averaged-gradient-based ADMM and a stochastic dual coordinate ascent ADMM, both of which obtain improved iteration complexities.…”
Section: Introduction (mentioning)
confidence: 99%
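In these linearized variants, the exact x-minimization is replaced by a cheap step that uses a noisy gradient g of l at the current iterate plus a varying proximal term. A sketch of one such x-update under the same splitting as above; the closed form and step-size rule are illustrative assumptions, not the exact update of any one cited method:

```python
import numpy as np

def linearized_x_update(x, z, u, g, F, rho, eta):
    """One linearized stochastic-ADMM x-step (illustrative sketch):

        argmin_x <g, x> + (rho/2)||Fx - z + u||^2 + ||x - x_k||^2/(2*eta)

    where g is a noisy (sub)gradient of l at x_k and eta is a step size
    that typically decays, e.g. eta_k ~ 1/sqrt(k). The z- and u-updates
    remain as in the deterministic sketch above.
    """
    n = x.shape[0]
    M = rho * F.T @ F + np.eye(n) / eta
    rhs = x / eta - g + rho * F.T @ (z - u)
    return np.linalg.solve(M, rhs)
```

With F = I the linear solve collapses to a scalar rescaling, and the step reduces to a proximal stochastic gradient update coupled to the augmented-Lagrangian term.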
“…The authors of the monographs [14], [15] summarized several classes of derivative-free methods, covering both deterministic and stochastic cases. In addition, there are some other algorithms for solving convex optimization problems using derivative-free or zeroth-order information, including the zeroth-order mirror descent algorithm [18], the zeroth-order ADMM algorithm [19], the Kiefer-Wolfowitz (KW) algorithm [20], and the Nelder-Mead algorithm [22]. For the Nelder-Mead algorithm, introduced in [22], theoretical properties are, to the best of our knowledge, still under investigation.…”
(mentioning)
confidence: 99%