2020
DOI: 10.1007/s10589-020-00183-1

Accelerating incremental gradient optimization with curvature information

Cited by 10 publications (3 citation statements) | References 19 publications

“…Because of asynchrony, the right-hand side of the inequalities will involve delayed versions of $V_k$ and $W_k$ that perturb the convergence of the synchronous iteration. For example, in the analyses of the incremental aggregated gradient method [27], the accelerated incremental aggregated gradient method with curvature information [67], the asynchronous quasi-Newton method [21], and the asynchronous forward-backward method for solving monotone inclusion problems [60], one can establish iterate relationships of the form…”
Section: Novel Sequence Results for Asynchronous Iterations
confidence: 99%
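
To make the quoted passage concrete, here is a hedged sketch of the kind of delayed iterate relationship such analyses establish. This is an assumed representative form, not the exact inequality elided by the truncated quote above; the constants $a$, $b$, the delay bound $\tau$, and the perturbation sequence $W_k$ are placeholders.

```latex
% Illustrative delayed recursion (an assumption, not quoted from the source):
% the iterate measure V_{k+1} is bounded by its current value plus a term
% involving past values over a delay window of length tau.
\[
  V_{k+1} \;\le\; a\, V_k \;+\; b \max_{\max(0,\,k-\tau) \le j \le k} V_j \;+\; W_k,
  \qquad a + b < 1 .
\]
% Sequence results for asynchronous iterations show that recursions of this
% shape still converge (e.g., linearly when W_k decays geometrically),
% despite the bounded information delay tau.
```
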
“…Extensions of PIAG that allow for a non-smooth regularizer include [2,3,12] for convex f and [13,14] for non-convex f. In addition, a recent work [15] compensates for the information delays in PIAG using Hessian information. However, all these papers use an upper bound on the worst-case delay to determine the step-size.…”
Section: Algorithms and Related Work
confidence: 99%
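
For readers unfamiliar with PIAG (the proximal incremental aggregated gradient method the quote refers to), here is a minimal sketch in Python, assuming an $\ell_1$ regularizer and the simple delay-dependent step-size rule $\gamma = 1/(L(\tau+1))$ purely for illustration. The function names, the step-size rule, and the toy problem below are assumptions of this sketch, not taken from the cited papers.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1 (handles the non-smooth l1 regularizer).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def piag(grads, x0, L, tau, reg=0.1, iters=2000, seed=0):
    """Minimal PIAG sketch for min_x (1/n) * sum_i f_i(x) + reg * ||x||_1.

    grads : list of gradient functions, one per component f_i
    L     : smoothness constant (here, of each f_i)
    tau   : assumed worst-case delay bound, used only to set the step-size
    """
    rng = np.random.default_rng(seed)
    n = len(grads)
    x = x0.copy()
    table = [g(x) for g in grads]      # stored, possibly stale, gradients
    agg = np.sum(table, axis=0)        # running aggregate of the table
    gamma = 1.0 / (L * (tau + 1))      # illustrative delay-dependent step-size
    for _ in range(iters):
        i = rng.integers(n)            # component whose gradient is refreshed
        g_new = grads[i](x)
        agg += g_new - table[i]        # incremental update of the aggregate
        table[i] = g_new
        # Proximal step on the aggregated (delayed) gradient of the average.
        x = soft_threshold(x - gamma * agg / n, gamma * reg)
    return x

# Toy usage: quadratic components f_i(x) = 0.5 * ||A_i x - b_i||^2.
rng = np.random.default_rng(1)
As = [rng.standard_normal((5, 3)) for _ in range(4)]
bs = [rng.standard_normal(5) for _ in range(4)]
grads = [lambda x, A=A, b=b: A.T @ (A @ x - b) for A, b in zip(As, bs)]
L = max(np.linalg.norm(A, 2) ** 2 for A in As)
x_hat = piag(grads, np.zeros(3), L=L, tau=4, reg=0.1)
print(x_hat)
```

The sketch illustrates the design point the quoted passage criticizes: the step-size gamma is fixed from the a priori delay bound tau, so a loose bound forces a conservative step-size even when the delays actually realized during the run are small.
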
“…In addition, because $\tau_k \in [0, k]$ for all $k \in \mathbb{N}_0$, $\tau_0 = 0$ and $\gamma_0 = \alpha\gamma'$. Substituting these into (47) yields (15).…”
Section: Proof of Theorem
confidence: 99%