2021 IEEE Global Communications Conference (GLOBECOM)
DOI: 10.1109/globecom46510.2021.9685095

BePOCH: Improving Federated Learning Performance in Resource-Constrained Computing Devices

Abstract: Inference with trained machine learning models is now possible on small computing devices, whereas only a few years ago it was run mostly in the cloud. The recent technique of Federated Learning now offers a way to also train machine learning models on small devices, by distributing the computing effort needed for training over many distributed machines. However, training on these low-capacity devices takes a long time and often consumes all the available CPU resources of the device. The…
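As an illustration of the Federated Learning scheme the abstract describes (not the paper's own implementation), one round of federated averaging can be sketched as follows; all names and numbers here are illustrative assumptions:

```python
# Minimal sketch of one Federated Averaging (FedAvg) round, matching the
# informal description in the abstract. Illustrative only, not BePOCH itself.

def local_update(weights, gradient, lr=0.1):
    """One simulated local training step on a device (plain SGD)."""
    return [w - lr * g for w, g in zip(weights, gradient)]

def fedavg(client_weights, client_sizes):
    """Aggregate client models, weighted by local dataset size."""
    total = sum(client_sizes)
    dim = len(client_weights[0])
    return [
        sum(w[i] * n for w, n in zip(client_weights, client_sizes)) / total
        for i in range(dim)
    ]

# Two devices start from the same global model and train locally.
global_model = [0.0, 0.0]
clients = [
    local_update(global_model, gradient=[1.0, 2.0]),  # device with 10 samples
    local_update(global_model, gradient=[3.0, 4.0]),  # device with 30 samples
]
global_model = fedavg(clients, client_sizes=[10, 30])
print(global_model)  # weighted average of the two local updates
```

The aggregation weight by dataset size is the standard FedAvg choice; the server never sees raw data, only model parameters.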

Cited by 6 publications (3 citation statements). References 12 publications.
“…The communication cost is incurred due to the downloading of the global model from service S m to each UE and uploading the trained local models of UEs. Following existing studies [41], [15], [56], we assume that the communication cost is mainly due to the consumption of bandwidth resources, which is proportional to the amount of data to be transferred. Let c t k,q,i be the cost of transmitting a unit amount of data from UE ue k to location Loc q via base station bs i .…”
Section: Cost Models
confidence: 99%
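The quoted cost model assumes communication cost is proportional to the amount of data transferred, with a per-unit cost c_{k,q,i} for each UE/location/base-station triple. A minimal sketch of that model (function names and byte counts are assumptions, not from the citing paper):

```python
# Sketch of the communication-cost model in the quoted passage: the cost of
# a model transfer is proportional to the amount of data moved.

def comm_cost(unit_cost, model_bytes):
    """Cost of shipping `model_bytes` of model data at `unit_cost` per unit."""
    return unit_cost * model_bytes

def round_cost(unit_costs, model_bytes):
    """One FL round: each UE downloads the global model and uploads its
    local model, so every UE pays the transfer cost twice."""
    return sum(2 * comm_cost(c, model_bytes) for c in unit_costs)

# Three UEs, each reaching its location via a base station with its own
# per-unit bandwidth cost.
print(round_cost([0.5, 1.0, 2.0], model_bytes=100))  # → 700.0
```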
“…Following the similar derivation in Ineq. (15), the probability of violating the computing resource capacity on each Loc…”
Section: Proof
confidence: 99%
“…This approach uses weight-sharing methods for searching neural architectures. The training process for FL must not only be aimed at high accuracy but also at reducing the training time and resource consumption in practical environments, using low-capacity computing devices [37,38]. FL uses best epoch algorithm to determine how many epochs are necessary per training round.…”
Section: Evolutionary Algorithms
confidence: 99%
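The passage above refers to a "best epoch" rule that decides how many local epochs each training round needs. A hedged sketch of one such rule, stopping local training once validation loss stops improving; this is an illustration in that spirit, not the BePOCH algorithm from the paper:

```python
# Illustrative "best epoch" selection: pick the epoch with the lowest
# validation loss, stopping early after `patience` epochs without
# improvement (an assumption, not the paper's exact rule).

def best_epoch(val_losses, patience=2):
    """Return the 1-based epoch with the lowest validation loss seen before
    `patience` consecutive non-improving epochs trigger an early stop."""
    best_idx, best_loss, stale = 0, float("inf"), 0
    for i, loss in enumerate(val_losses):
        if loss < best_loss:
            best_idx, best_loss, stale = i, loss, 0
        else:
            stale += 1
            if stale >= patience:
                break
    return best_idx + 1

print(best_epoch([0.9, 0.7, 0.6, 0.65, 0.66, 0.5]))  # stops early, prints 3
```

Cutting local epochs this way trades a little accuracy per round for shorter training time and lower CPU load, which is the resource-constrained setting the paper targets.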