2021
DOI: 10.48550/arxiv.2104.03649
Preprint

Quantized Distributed Gradient Tracking Algorithm with Linear Convergence in Directed Networks

Abstract: Communication efficiency is a major bottleneck in the applications of distributed networks. To address this issue, quantized distributed optimization has attracted considerable attention. However, most existing quantized distributed optimization algorithms converge only sublinearly. To achieve linear convergence, this paper proposes a novel quantized distributed gradient tracking algorithm (Q-DGT) to minimize a finite sum of local objective functions over directed networks. Moreover, we …
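The abstract describes an algorithm built on quantized communication. The paper's own quantizer is not reproduced in this record; as a minimal illustration of the standard uniform quantizer that such algorithms typically build on (the function name, range parameter `r`, and level count are assumptions of this sketch, not taken from the paper):

```python
import numpy as np

def uniform_quantize(x: np.ndarray, r: float, num_levels: int) -> np.ndarray:
    """Standard uniform quantizer: map each entry of x to the nearest of
    num_levels evenly spaced points on [-r, r] (r is the quantization range)."""
    delta = 2.0 * r / (num_levels - 1)   # spacing between adjacent levels
    clipped = np.clip(x, -r, r)          # saturate out-of-range entries
    return np.round((clipped + r) / delta) * delta - r
```

For any input inside [-r, r], the quantization error is bounded by half the level spacing, which is the property linear-convergence analyses of quantized methods typically rely on.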

Cited by 7 publications (16 citation statements)
References 27 publications
“…the second inequality holds due to the Cauchy-Schwarz inequality; and the last inequality holds due to (28). Denote C_r(·) = C(·)/r; then we have…”
Section: B Proof of Theorem
confidence: 99%
“…algorithms; [15] employed biased but contractive compressors to design a decentralized SGD algorithm; [16] and [17], [18] utilized unbiased compressors to respectively design decentralized gradient descent and primal-dual algorithms; [19] and [20] made use of the standard uniform quantizer to respectively design decentralized subgradient methods and alternating direction method of multipliers approaches; [21], [22] and [23] respectively adopted the unbiased random quantization and the adaptive quantization to design decentralized projected subgradient algorithms; [24] and [25]–[28] exploited the standard uniform quantizer with dynamic quantization level to respectively design decentralized subgradient and primal-dual algorithms; and [29] applied the standard uniform quantizer with a fixed quantization level to design a decentralized gradient descent algorithm. The compressors mentioned above can be unified into three general classes…”
Section: A Related Work and Motivation
confidence: 99%
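The compressor classes surveyed in the quote above can be sketched with toy implementations; this is a minimal illustration assuming a QSGD-style unbiased random quantizer and a top-k contractive compressor (all function names, parameters, and the level count are illustrative, not taken from the cited works):

```python
import numpy as np

rng = np.random.default_rng(0)

def unbiased_random_quantize(x: np.ndarray, num_levels: int = 4) -> np.ndarray:
    """Unbiased stochastic quantizer (QSGD-style): round each normalized
    entry up or down at random so that E[Q(x)] = x."""
    norm = np.linalg.norm(x)
    if norm == 0:
        return np.zeros_like(x)
    scaled = np.abs(x) / norm * num_levels   # each entry mapped into [0, num_levels]
    lower = np.floor(scaled)
    prob_up = scaled - lower                 # P(round up) chosen to keep the mean exact
    levels = lower + (rng.random(x.shape) < prob_up)
    return np.sign(x) * levels * norm / num_levels

def top_k(x: np.ndarray, k: int = 2) -> np.ndarray:
    """Biased but contractive compressor: keep the k largest-magnitude entries
    and zero the rest, so ||C(x) - x||^2 <= (1 - k/d) ||x||^2 for x in R^d."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    out[idx] = x[idx]
    return out
```

The unbiased quantizer introduces noise but no systematic error; top-k is biased yet contracts the compression error, which is why the two classes require different convergence analyses.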