2018
DOI: 10.48550/arxiv.1803.06443
Preprint
Communication Compression for Decentralized Training

Abstract: Optimizing distributed learning systems is an art of balancing between computation and communication. There have been two lines of research that try to deal with slower networks: communication compression for low-bandwidth networks, and decentralization for high-latency networks. In this paper, we explore a natural question: can the combination of both techniques lead to a system that is robust to both bandwidth and latency? Although the system implication of such a combination is trivial, the underlying theoreti…
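For intuition, below is a minimal sketch of the general idea the abstract describes: decentralized SGD in which workers gossip compressed messages with their neighbors instead of full-precision models. This is not the paper's exact algorithm; the sign quantizer, mixing weights, and all names are illustrative assumptions.

import numpy as np

def sign_compress(delta):
    # 1-bit sign compression with a mean-magnitude scale; an illustrative
    # quantizer choice, not necessarily the one analyzed in the paper.
    scale = np.mean(np.abs(delta))
    return scale * np.sign(delta)

def decentralized_step(params, grads, neighbors, lr=0.1):
    # One step of decentralized SGD where each worker mixes in *compressed*
    # model differences from its neighbors rather than full-precision models.
    # params    : list of per-worker parameter vectors (np.ndarray)
    # grads     : list of per-worker stochastic gradients (np.ndarray)
    # neighbors : dict mapping worker id -> list of neighbor worker ids
    new_params = []
    for i in range(len(params)):
        mixed = params[i].copy()
        for j in neighbors[i]:
            # Only the compressed difference travels over the network.
            mixed += sign_compress(params[j] - params[i]) / (len(neighbors[i]) + 1)
        # Local SGD update after the compressed gossip-averaging step.
        new_params.append(mixed - lr * grads[i])
    return new_params

# Tiny usage example: 3 workers on a ring topology, toy quadratic objective.
rng = np.random.default_rng(0)
params = [rng.normal(size=4) for _ in range(3)]
ring = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
for _ in range(50):
    grads = [2 * p for p in params]          # gradient of ||p||^2
    params = decentralized_step(params, grads, ring)
print([np.linalg.norm(p) for p in params])   # norms shrink toward 0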

Cited by 36 publications (10 citation statements) | References 17 publications
“…Overall, the end-to-end training time is acceptable (less than 1 hour) for practical applications. We can also see that communication accounts for almost half of the end-to-end training time, indicating that communication compression techniques [7,18] are essential to improving system performance in the IoT setting.…”
Section: Analysis of System Efficiency (mentioning)
confidence: 96%
“…However, FL does allow a more peer-to-peer learning approach, in which each trained node can benefit from the other nodes trained on the FL network. Even in decentralized training, similar communication challenges exist, and methods such as compression [60] can be used to tackle them. This subsection describes how decentralized or peer-to-peer learning is utilized or integrated into an FL environment.…”
Section: Decentralized (mentioning)
confidence: 99%
“…In addition, there are also multiple works investigating communication reduction in decentralized optimization [44].…”
Section: Decentralized Optimization (mentioning)
confidence: 99%