2021 · Preprint
DOI: 10.48550/arxiv.2112.07836

Communication-Efficient Distributed SGD with Compressed Sensing

Yujie Tang, Vikram Ramanathan, Junshan Zhang, et al.

Abstract: We consider large scale distributed optimization over a set of edge devices connected to a central server, where the limited communication bandwidth between the server and edge devices imposes a significant bottleneck for the optimization procedure. Inspired by recent advances in federated learning, we propose a distributed stochastic gradient descent (SGD) type algorithm that exploits the sparsity of the gradient, when possible, to reduce communication burden. At the heart of the algorithm is to use compressed… [abstract truncated]
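The abstract outlines a concrete pipeline: each device compresses its local stochastic gradient with a compressed-sensing measurement (a linear random projection), the server aggregates the compressed vectors, and a sparse approximation of the global gradient is recovered from the aggregate. The sketch below illustrates that general idea only; it is not the paper's implementation. The Gaussian measurement matrix, the iterative-hard-thresholding (IHT) recovery, the sparsity level k, and the function names (compress_gradient, recover_sparse_gradient) are all assumptions made for exposition.

```python
import numpy as np

# Minimal sketch of compressed-sensing gradient aggregation.
# All specifics (Gaussian A, IHT recovery, k, m) are assumed, not from the paper.
rng = np.random.default_rng(0)

d, m, k = 1000, 200, 20                     # gradient dim, measurement dim, assumed sparsity
A = rng.normal(size=(m, d)) / np.sqrt(m)    # shared random measurement matrix

def compress_gradient(g):
    """Device side: project the local stochastic gradient down to m measurements."""
    return A @ g

def recover_sparse_gradient(y, n_iters=50, step=0.5):
    """Server side: iterative hard thresholding (IHT) recovers a k-sparse
    approximation of the gradient from the aggregated measurements y."""
    g_hat = np.zeros(d)
    for _ in range(n_iters):
        g_hat = g_hat + step * A.T @ (y - A @ g_hat)   # gradient step on ||y - A g||^2
        idx = np.argpartition(np.abs(g_hat), -k)[:-k]  # indices outside the top-k magnitudes
        g_hat[idx] = 0.0                               # hard-threshold to keep g_hat k-sparse
    return g_hat

# Simulate: each device holds a noisy copy of a k-sparse "true" gradient.
g_true = np.zeros(d)
support = rng.choice(d, size=k, replace=False)
g_true[support] = rng.normal(size=k)

n_devices = 10
local_grads = [g_true + 0.01 * rng.normal(size=d) for _ in range(n_devices)]

# Devices send compressed gradients; the server averages the measurements
# (averaging commutes with the linear projection) and then recovers.
y_agg = np.mean([compress_gradient(g) for g in local_grads], axis=0)
g_rec = recover_sparse_gradient(y_agg)

print("relative recovery error:",
      np.linalg.norm(g_rec - g_true) / np.linalg.norm(g_true))
```

Because averaging commutes with the linear measurement, each device sends only m ≪ d numbers per round while the server still obtains a single k-sparse descent direction, which is where the communication savings come from in this kind of scheme.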

Cited by 0 publications. References 23 publications (35 reference statements).