2023
DOI: 10.1109/tnnls.2022.3170944

Distributed Stochastic Gradient Tracking Algorithm With Variance Reduction for Non-Convex Optimization

Abstract: This paper considers a distributed stochastic nonconvex optimization problem, where the nodes in a network cooperatively minimize a sum of L-smooth local cost functions with sparse gradients. By adaptively adjusting the stepsizes according to the historical (possibly sparse) gradients, a distributed adaptive gradient algorithm is proposed, in which a gradient tracking estimator is used to handle the heterogeneity between different local cost functions. We establish an upper bound on the optimality gap, which i…
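The abstract describes the algorithm only at a high level, so the following is a hedged illustration rather than the authors' method: a generic decentralized stochastic gradient-tracking loop with an AdaGrad-style per-coordinate stepsize standing in for the paper's adaptive stepsize rule. The ring topology, Metropolis mixing weights, quadratic local costs, noise level, and the parameters alpha and eps are all assumptions made for this sketch.

```python
# Minimal decentralized stochastic gradient-tracking sketch (illustrative only;
# not the paper's exact algorithm). Assumptions: ring topology, quadratic local
# costs, Gaussian gradient noise, AdaGrad-style per-coordinate stepsizes.
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 10                      # number of nodes, problem dimension

# Doubly stochastic mixing matrix W for a ring graph (Metropolis weights).
W = np.zeros((n, n))
for i in range(n):
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = 1 / 3
    W[i, i] = 1 / 3

# Heterogeneous local costs f_i(x) = 0.5 * ||A_i x - b_i||^2.
A = rng.normal(size=(n, d, d))
b = rng.normal(size=(n, d))

def stoch_grad(i, x):
    """Stochastic gradient of f_i at x (exact gradient plus Gaussian noise)."""
    g = A[i].T @ (A[i] @ x - b[i])
    return g + 0.1 * rng.normal(size=d)

alpha, eps = 0.05, 1e-8
x = np.zeros((n, d))              # local iterates x_i
g = np.array([stoch_grad(i, x[i]) for i in range(n)])
y = g.copy()                      # gradient trackers y_i, initialized at g_i(x_i^0)
acc = np.zeros((n, d))            # accumulated squared gradients for adaptive steps

for k in range(500):
    acc += y ** 2                 # AdaGrad-style accumulator (an assumed variant)
    step = alpha / np.sqrt(acc + eps)
    x_new = W @ x - step * y      # mixing/consensus step plus tracked-gradient descent
    g_new = np.array([stoch_grad(i, x_new[i]) for i in range(n)])
    y = W @ y + g_new - g         # gradient-tracking update
    x, g = x_new, g_new

print("disagreement:", np.linalg.norm(x - x.mean(axis=0)))
print("avg gradient norm:", np.linalg.norm(np.mean(g, axis=0)))
```

In gradient-tracking methods the tracker y_i converges to the average of the local gradients, which is what lets heterogeneous nodes agree on a common descent direction even when individual local gradients are sparse or noisy.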

Cited by 3 publications (1 citation statement)
References 66 publications (65 reference statements)
“…Li et al (2021) proposed a similar algorithm with a nested loop structure for the sake of improving its overall complexity. Xin et al (2020) and Jiang et al (2022) consider a similar GT-VR framework and obtain a linear rate for strongly convex problems and an O(1/k) rate for the non-convex setting, respectively. Similar attempts have been recently made towards composite optimization problems (Ye et al, 2020).…”
Section: Introduction
confidence: 99%
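The "GT-VR framework" named in the citation statement combines a gradient-tracking consensus step with a variance-reduced local gradient estimator. Below is a minimal, assumption-laden sketch of the SVRG-style estimator such frameworks typically plug into a tracking update like the one above; the component costs, snapshot point, and all names are illustrative and not taken from any of the cited papers.

```python
# SVRG-style variance-reduced gradient estimator for one node's finite-sum cost
# f_i(x) = (1/m) * sum_j 0.5 * (a_j^T x - b_j)^2. Illustrative of the "VR" part
# of GT-VR; the data and names below are assumptions for the sketch.
import numpy as np

rng = np.random.default_rng(1)
m, d = 100, 10
A = rng.normal(size=(m, d))
b = rng.normal(size=m)

def grad_j(x, j):
    """Gradient of the j-th component 0.5 * (a_j^T x - b_j)^2."""
    return (A[j] @ x - b[j]) * A[j]

def full_grad(x):
    """Exact gradient of the local finite-sum cost."""
    return A.T @ (A @ x - b) / m

x_ref = np.zeros(d)                        # snapshot (reference) point
mu = full_grad(x_ref)                      # full gradient at the snapshot
x = rng.normal(size=d)                     # current iterate

j = rng.integers(m)                        # sample one component
v = grad_j(x, j) - grad_j(x_ref, j) + mu   # variance-reduced estimator
# E[v] equals full_grad(x), and its variance shrinks as x approaches x_ref,
# which is what enables the faster convergence rates quoted above.
```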