2023
DOI: 10.1587/transfun.2023eap1020

Quantized Gradient Descent Algorithm for Distributed Nonconvex Optimization

Abstract: This paper presents a quantized gradient descent algorithm for distributed nonconvex optimization in multiagent systems that takes into account the bandwidth limitation of communication channels. Each agent encodes its estimation variable using a zoom-in parameter and sends the quantized intermediate variable to its neighboring agents. Each agent then updates its estimate by decoding the received information. In this paper, we show that all agents achieve consensus and their estimated variables converge to …
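
The abstract describes the mechanism only at a high level (zoom-in encoding, quantized transmission, decode-and-update). The sketch below is an illustrative reconstruction under assumed choices, not the paper's algorithm: it assumes a uniform quantizer on a symmetric range, a fixed ring communication graph, a geometrically shrinking zoom-in scale, constant step sizes, and toy nonconvex local costs. Agents exchange only quantized innovations against a shared decoded copy, then take a consensus-plus-gradient step.

    # Minimal sketch of a zoom-in quantized distributed gradient method.
    # All design choices (uniform quantizer, ring graph, geometric zoom-in
    # factor, constant step sizes, toy local costs) are assumptions for
    # illustration, not the paper's exact scheme.
    import numpy as np

    rng = np.random.default_rng(0)

    n_agents = 5
    levels = 8       # quantizer resolution per message (assumption)
    gamma = 0.98     # zoom-in factor: scale shrinks geometrically (assumption)
    alpha = 0.05     # gradient step size (assumption)
    beta = 0.3       # consensus gain (assumption)
    T = 400

    # Toy nonconvex local costs f_i(x) = (x - a_i)^2 + sin(3x) (assumption).
    targets = rng.uniform(-2.0, 2.0, n_agents)

    def grad_f(i, x):
        return 2.0 * (x - targets[i]) + 3.0 * np.cos(3.0 * x)

    def quantize(v, scale, levels):
        """Uniform quantizer on [-scale, scale] with the given number of levels."""
        v = np.clip(v, -scale, scale)
        step = 2.0 * scale / (levels - 1)
        return np.round(v / step) * step

    # Fixed ring communication graph (assumption).
    neighbors = {i: [(i - 1) % n_agents, (i + 1) % n_agents] for i in range(n_agents)}

    x = rng.uniform(-3.0, 3.0, n_agents)   # local estimates
    x_hat = np.zeros(n_agents)             # decoded copies shared by encoder and decoders
    scale = 4.0                            # initial zoom-in scale (assumption)

    for k in range(T):
        # Encode: quantize the innovation between the estimate and the last
        # decoded value, so only a few bits per round are transmitted.
        q = quantize(x - x_hat, scale, levels)
        # Decode: every receiver (and the sender itself) updates the shared copy.
        x_hat = x_hat + q
        # Update: consensus on decoded neighbor copies plus a local gradient step.
        x_new = np.empty_like(x)
        for i in range(n_agents):
            consensus = sum(x_hat[j] - x_hat[i] for j in neighbors[i])
            x_new[i] = x[i] + beta * consensus - alpha * grad_f(i, x[i])
        x = x_new
        scale *= gamma   # zoom in: quantization becomes finer as agents agree

    print("final estimates:", np.round(x, 3))
    print("spread:", np.round(x.max() - x.min(), 4))

The shrinking scale is what keeps the quantization error from blocking consensus in this toy run; the actual encoder/decoder construction, scale schedule, and convergence conditions are those of the paper and are not reproduced here.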

Cited by 0 publications
References 38 publications