IEEE INFOCOM 2020 - IEEE Conference on Computer Communications
DOI: 10.1109/infocom41043.2020.9155432

Communication-Efficient Network-Distributed Optimization with Differential-Coded Compressors

Cited by 2 publications (1 citation statement)
References 14 publications
“…Similar compression ideas have also been extended to decentralized learning over networks without dedicated parameter servers. In [39,40], Zhang et al developed a series of decentralized learning algorithms with differential-coded compressions. Koloskova et al [14] proposed an algorithm that could achieve a linear speedup with respect to the number of workers for convergence in decentralized learning with arbitrary gradient compression.…”
Section: Introduction
Citation type: mentioning
Confidence: 99%