2022 30th European Signal Processing Conference (EUSIPCO) 2022
DOI: 10.23919/eusipco55093.2022.9909791
Finite Bit Quantization for Decentralized Learning Under Subspace Constraints

Abstract: In this paper, we consider decentralized optimization problems in which agents have individual cost functions to minimize, subject to subspace constraints that require the minimizers across the network to lie in low-dimensional subspaces. This constrained formulation includes consensus optimization as a special case and allows for more general task-relatedness models, such as multitask smoothness and coupled optimization. In order to cope with communication constraints, we propose and study a quantized differential …
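The abstract is truncated, but the core communication-saving idea it names — quantized *differential* communication — can be illustrated with a generic sketch: instead of transmitting its full estimate, an agent transmits a finite-bit quantized difference from the last transmitted value, so the quantizer input shrinks as the iterates converge. The quantizer below is a plain uniform mid-tread scheme chosen for illustration; the paper's exact quantization rule may differ.

```python
import numpy as np

def quantize(x, bits=4, xmax=1.0):
    """Uniform finite-bit quantizer mapping values in [-xmax, xmax]
    to one of 2**bits evenly spaced levels (illustrative, not the
    paper's specific scheme)."""
    levels = 2 ** bits
    step = 2.0 * xmax / (levels - 1)
    return np.round(np.clip(x, -xmax, xmax) / step) * step

# Differential quantization: the agent sends a quantized *difference*
# relative to what the receiver already has, and both sides update a
# shared reconstruction. Toy values below are hypothetical.
prev_sent = np.zeros(3)                  # receiver's current copy
w = np.array([0.31, -0.72, 0.05])        # agent's new local estimate
delta_q = quantize(w - prev_sent, bits=4)
recon = prev_sent + delta_q              # receiver's reconstruction of w
```

With 4 bits over [-1, 1] the step size is 2/15 ≈ 0.133, so each coordinate of the reconstruction lands within half a step of the true estimate; as the network converges, the differences shrink and the same bit budget yields progressively finer effective resolution.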

Cited by 3 publications (1 citation statement)
References 13 publications
“…In order to reduce the communication bottleneck, the model updates are compressed (quantized or sparsified) by the nodes. A similar approach is proposed in [19] for decentralized learning, in which the communicated estimates among the users are quantized. Similar compression and quantization solutions are studied in [20] and [21] to solve distributed optimization problems with communication constraints.…”
Section: Background and Motivations
confidence: 99%