2022
DOI: 10.48550/arxiv.2201.03930
Preprint

Communication Compression for Distributed Nonconvex Optimization

Abstract: This paper considers decentralized nonconvex optimization in which the cost functions are distributed over agents. Noting that information compression is a key tool for reducing the heavy communication load of decentralized algorithms, in which agents iteratively communicate with their neighbors, we propose three decentralized primal-dual algorithms with compressed communication. The first two algorithms apply to a general class of compressors with bounded relative compression error, and the third algorithm is suitable…
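For readers unfamiliar with the compressor class mentioned in the abstract, the short sketch below (not taken from the paper) illustrates a standard compressor with bounded relative compression error: top-k sparsification, which satisfies ||C(x) − x||² ≤ (1 − k/d)·||x||² for x ∈ R^d. The function name top_k and the test values are purely illustrative.

```python
import numpy as np

def top_k(x: np.ndarray, k: int) -> np.ndarray:
    """Keep the k largest-magnitude entries of x, zero out the rest.

    Top-k sparsification is a standard compressor with bounded relative
    compression error: ||top_k(x) - x||^2 <= (1 - k/d) * ||x||^2, d = len(x).
    """
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]  # indices of the k largest-magnitude entries
    out[idx] = x[idx]
    return out

# Quick numerical check of the relative-error bound on a random vector.
rng = np.random.default_rng(0)
x = rng.standard_normal(100)
k = 10
err = np.linalg.norm(top_k(x, k) - x) ** 2
bound = (1 - k / x.size) * np.linalg.norm(x) ** 2
assert err <= bound + 1e-12
```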

Cited by 3 publications (8 citation statements) | References 26 publications

“…In recent years, various techniques have been developed to reduce communication costs [27,31]. They are extensively incorporated into centralized optimization methods [1,23,30] and decentralized methods [8,9,18,20,33]. This motivates us to provide an extension of D-ASCGD by combining it with a communication compression method, which reads as follows.…”
Section: Compressed D-ASCGD Methods (mentioning)
confidence: 99%
“…Similar to (52), 24ȓ²(1 + α_y r²ψ²), (33) holds. Inequality (34) can be obtained by an analysis similar to that of (33).…”
Section: Proof of Lemma (mentioning)
confidence: 97%
“…(2) primal-dual-like methods [22,17,15,14,54,51]; (3) gradient-tracking-based algorithms [20,55,37,50,47].…”
Section: Related Work (mentioning)
confidence: 99%
“…The paper [22] equipped NIDS [18] with communication compression and demonstrated its linear convergence rate when the objective functions are smooth and strongly convex. More recently, the papers [20,37] considered a class of linearly convergent decentralized gradient tracking methods with communication compression that apply to general directed graphs; see also [54,51] for more related works.…”
Section: Introduction (mentioning)
confidence: 99%
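As background on how such compressors are typically plugged into decentralized iterations like those discussed in the citation statements above, the sketch below shows a generic difference-compression (error-feedback) consensus step: each agent broadcasts only a compressed difference between its local state and a publicly shared reference copy. This is an illustrative sketch in the spirit of CHOCO-GOSSIP-style schemes, not the specific algorithms of [20], [22], or [37]; the ring topology, mixing weights, and stepsize gamma are assumptions.

```python
import numpy as np

def top_k(x, k):
    # Top-k sparsification (a compressor with bounded relative error).
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    out[idx] = x[idx]
    return out

n, d, k, gamma, steps = 5, 20, 4, 0.05, 3000   # assumed problem sizes and stepsize
rng = np.random.default_rng(1)

# Doubly stochastic mixing weights on a ring graph (assumption).
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

x = rng.standard_normal((n, d))   # local states, one row per agent
x_hat = np.zeros((n, d))          # publicly shared (compressed) reference copies
target = x.mean(axis=0)           # consensus on the average preserves the initial mean

for _ in range(steps):
    # Each agent transmits only the compressed difference to its reference copy.
    q = np.array([top_k(x[i] - x_hat[i], k) for i in range(n)])
    x_hat += q                            # all agents update the shared copies
    x += gamma * (W @ x_hat - x_hat)      # consensus step using only compressed info

# Consensus error; it shrinks toward zero for a small enough stepsize gamma.
print(np.linalg.norm(x - target))
```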