2020
DOI: 10.1016/j.ins.2020.05.137

Federated learning with adaptive communication compression under dynamic bandwidth and unreliable networks

Cited by 36 publications (9 citation statements)
References 9 publications
“…However, it might be unreasonable to assume that each communication link has the same loss rate, since in reality it varies with the duration and packet size of different packets. Consequently, Zhang et al. [110] design a new algorithm called ACFL to adaptively compress the information from the shared model based on the physical conditions of the current network, taking into account the drop rate and the total number of transmitted packets.…”
Section: Unreliable Network
confidence: 99%
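As an illustration of the adaptive compression idea summarized in this statement, the sketch below adapts a top-k sparsification ratio to the observed packet drop rate, sending fewer entries over lossier links. The function names, the linear shrinking rule, and the 1% floor are assumptions made for illustration; this is not the ACFL algorithm of Zhang et al. [110].

```python
import numpy as np

def adaptive_topk_compress(update, drop_rate, base_ratio=0.1):
    """Keep only the largest-magnitude entries of a model update.

    The kept fraction shrinks as the observed packet drop rate grows,
    so less traffic is pushed over a lossy link. Illustrative heuristic
    only, not the cited ACFL rule.
    """
    # Shrink the kept fraction on lossy links; keep at least 1% of entries.
    ratio = max(0.01, base_ratio * (1.0 - drop_rate))
    k = max(1, int(ratio * update.size))
    flat = update.ravel()
    # Indices of the k largest-magnitude entries.
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    return idx, flat[idx]

def decompress(idx, values, shape):
    """Rebuild a dense update from the sparse (index, value) pairs."""
    flat = np.zeros(int(np.prod(shape)))
    flat[idx] = values
    return flat.reshape(shape)

if __name__ == "__main__":
    update = np.random.randn(4, 8)
    idx, vals = adaptive_topk_compress(update, drop_rate=0.3)
    restored = decompress(idx, vals, update.shape)
    print("entries sent:", len(vals), "of", update.size)
```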
“…When the local model completes a model parameter update, the change in the model weights is recorded, and these recorded data are uploaded to the central server. In this process, the local model training of each computing node is relatively independent, and communication is achieved through the coordination of the central server [29]. At the beginning of the next iteration, the local model uses the parameters downloaded from the central server as its initial model parameters and then applies the constrained stochastic gradient descent method to update the local model.…”
Section: Differential Privacy Preprocessing of the Local Data
confidence: 99%
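The client-side round described in this statement can be sketched as follows: start from the downloaded global weights, run a constrained (here, gradient-norm-clipped) gradient descent pass on local data, and upload only the resulting weight change. A linear-regression loss and the clipping threshold are assumptions standing in for the cited paper's constrained SGD; this is a minimal sketch, not their exact procedure.

```python
import numpy as np

def local_round(global_weights, X, y, lr=0.01, epochs=5, clip=1.0):
    """One client round: initialize from the downloaded global weights,
    run clipped gradient descent on the local data (X, y), and return
    the weight change to be uploaded to the central server."""
    w = global_weights.copy()
    for _ in range(epochs):
        grad = 2.0 * X.T @ (X @ w - y) / len(y)  # MSE gradient
        norm = np.linalg.norm(grad)
        if norm > clip:                          # constrain the update step
            grad = grad * (clip / norm)
        w -= lr * grad
    return w - global_weights                    # only the delta is uploaded
```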
“…Federated learning [31], [32] is an emerging area of research in decentralized training in an edge-cloud setting, i.e., a common shared model is trained on millions of IoT devices using the local data present on each device itself. The updated model is then sent to the cloud for global model aggregation.…”
Section: Related Work and Motivation
confidence: 99%
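The global aggregation step mentioned in this statement is commonly realized as a data-size-weighted average of the client updates (FedAvg-style). The sketch below shows that aggregation under the assumption that each client reports its update and local sample count; it is illustrative, not any specific paper's implementation.

```python
import numpy as np

def federated_average(client_updates, client_sizes):
    """Cloud-side aggregation: combine client model updates into the
    shared global update, weighting each client by its local data size."""
    total = float(sum(client_sizes))
    return sum(
        (n / total) * update
        for update, n in zip(client_updates, client_sizes)
    )

if __name__ == "__main__":
    updates = [np.ones(3), 2 * np.ones(3)]
    sizes = [100, 300]
    print(federated_average(updates, sizes))  # weighted toward the larger client
```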