2023
DOI: 10.1109/tsp.2023.3250839
Decentralized Inexact Proximal Gradient Method With Network-Independent Stepsizes for Convex Composite Optimization

Abstract: Distributed optimization methods with probabilistic local updates have recently gained attention for their provable communication acceleration. However, this capability holds only when the loss function is smooth and the network is sufficiently well connected. In this paper, we propose MG-SKIP, the first linearly convergent method with probabilistic local updates for nonsmooth distributed optimization. Without any extra condition on the network connectivity, MG-SKIP allows for the multiple…
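To make the abstract's setting concrete, the following is a minimal illustrative sketch of a decentralized proximal gradient iteration with probabilistic local updates, i.e., nodes take local composite-optimization steps every round but exchange information with neighbors only with some probability. This is a hypothetical toy construction, not the paper's MG-SKIP algorithm; the problem (distributed lasso), the mixing matrix `W`, and all parameter names are assumptions made for illustration.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def decentralized_prox_grad(A_list, b_list, W, lam=0.1, step=0.01,
                            p=0.5, iters=500, seed=0):
    """Toy decentralized proximal gradient with probabilistic communication.

    Each node i holds a local least-squares loss f_i(x) = 0.5*||A_i x - b_i||^2
    and a shared nonsmooth term g(x) = lam * ||x||_1 handled via its prox.
    With probability p per round, nodes average with neighbors through the
    doubly stochastic mixing matrix W; otherwise communication is skipped.
    """
    rng = np.random.default_rng(seed)
    n = len(A_list)                       # number of nodes
    d = A_list[0].shape[1]                # decision-variable dimension
    X = np.zeros((n, d))                  # one local iterate per node (rows)
    for _ in range(iters):
        # local proximal gradient step at every node
        grads = np.stack([A.T @ (A @ x - b)
                          for A, b, x in zip(A_list, b_list, X)])
        X = soft_threshold(X - step * grads, step * lam)
        # probabilistic local updates: communicate only with probability p
        if rng.random() < p:
            X = W @ X                     # gossip / mixing step
    return X

# tiny usage example on a fully connected 3-node network
rng = np.random.default_rng(1)
A_list = [rng.standard_normal((8, 4)) for _ in range(3)]
b_list = [rng.standard_normal(8) for _ in range(3)]
W = np.array([[0.5, 0.25, 0.25],
              [0.25, 0.5, 0.25],
              [0.25, 0.25, 0.5]])        # doubly stochastic mixing matrix
X = decentralized_prox_grad(A_list, b_list, W)
```

Skipping the mixing step on most rounds is what reduces communication; the paper's contribution, per the abstract, is proving linear convergence of such a scheme in the nonsmooth (composite) setting without extra network-connectivity conditions.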

Cited by 8 publications
References 54 publications