2019
DOI: 10.48550/arxiv.1903.08149
Preprint

Nested Distributed Gradient Methods with Adaptive Quantized Communication

Cited by 3 publications (2 citation statements)
References 35 publications
“…We then generalize the results to the case where the number of communication steps varies at every iteration, and the quantization is adaptive (NEAR-DGD⁺+Q). For brevity we omit some of the proofs, and refer interested readers to [44]. We make the following assumptions that are standard in the distributed optimization literature [10], [11], [30].…”
Section: Convergence Analysis (mentioning)
confidence: 99%
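
The statement above refers to nested schemes in which each outer iteration performs a local gradient step followed by several quantized consensus (communication) rounds, with the number of rounds and the quantization precision allowed to change across iterations. Below is a minimal sketch of one such iteration; the uniform stochastic quantizer, the mixing matrix W, and the parameter names are illustrative assumptions, not the cited papers' exact specifications.

```python
import numpy as np

def uniform_quantize(x, num_bits, scale=1.0, rng=None):
    """Unbiased stochastic uniform quantizer with 2**num_bits levels on
    [-scale, scale] (an illustrative choice, not the paper's exact quantizer)."""
    if rng is None:
        rng = np.random.default_rng()
    levels = 2 ** num_bits - 1
    t = (np.clip(x, -scale, scale) + scale) / (2 * scale) * levels  # map to [0, levels]
    low = np.floor(t)
    q = low + (rng.random(x.shape) < (t - low))   # randomized rounding keeps E[q] = t
    return q / levels * (2 * scale) - scale       # map back to [-scale, scale]

def nested_quantized_step(x, grad_fn, W, step_size, n_comm, num_bits):
    """One outer iteration of a nested (NEAR-DGD-style) scheme: a local gradient
    step followed by `n_comm` consensus rounds over quantized iterates.
    x: (n_agents, dim) stacked local iterates; grad_fn(x): row-wise local gradients;
    W: (n_agents, n_agents) doubly stochastic mixing matrix."""
    y = x - step_size * grad_fn(x)                # local gradient step
    for _ in range(n_comm):                       # nested communication rounds
        y = W @ uniform_quantize(y, num_bits)     # mix quantized copies of the iterates
    return y
```

The adaptive behavior described in the quote would correspond, for example, to increasing `n_comm` and `num_bits` with the outer iteration counter so that the communication error shrinks as the iterates converge; the exact schedules used in the cited works may differ.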
“…Reisizadeh et al. (2018a) propose an algorithm that can still converge, though at a slower rate than the exact scheme. Another line of work proposed adaptive schemes (with increasing compression accuracy) that converge at the expense of higher communication cost (Carli et al., 2010a; Doan et al., 2018; Berahas et al., 2019). For deep learning applications, Tang et al. (2018) proposed the DCD algorithm that converges at the same rate as the centralized baseline, O(1/√(nT) + n/(ρ²T)), though only for quantization with quality δ = max{1 − Θ(ρ²), 1/4}, which may be limiting (not allowing for the desired compression ratio) on many network topologies (e.g.…”
Section: Related Work (mentioning)
confidence: 99%
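
The quantization quality δ in this quote follows the standard compression-operator convention in this literature, E‖C(x) − x‖² ≤ (1 − δ)‖x‖², with δ = 1 meaning exact communication. As a minimal illustration (my own example, not the operator analyzed in the cited works), random-k sparsification of a d-dimensional vector achieves δ = k/d:

```python
import numpy as np

def rand_k(x, k, rng=None):
    """Random-k sparsification of a 1-D vector x: keep k randomly chosen
    coordinates and zero the rest. Satisfies
    E||rand_k(x) - x||^2 = (1 - k/d) * ||x||^2 for d = len(x),
    i.e. compression quality delta = k/d."""
    if rng is None:
        rng = np.random.default_rng()
    out = np.zeros_like(x)
    idx = rng.choice(x.size, size=k, replace=False)  # coordinates to keep
    out[idx] = x[idx]
    return out
```

Under this convention, the DCD requirement quoted above, δ ≥ max{1 − Θ(ρ²), 1/4}, limits how aggressive the compression (e.g. how small k/d) can be on poorly connected topologies with a small spectral gap ρ.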