2020
DOI: 10.3390/electronics9091359
Optimal User Selection for High-Performance and Stabilized Energy-Efficient Federated Learning Platforms

Abstract: Federated learning-enabled edge devices train global models by sharing them while avoiding local data sharing. In federated learning, the sharing of models through communication between several clients and central servers results in various problems, such as high latency and network congestion. Moreover, battery consumption problems caused by local training procedures may impact power-hungry clients. To tackle these issues, federated edge learning (FEEL) applies the network edge technologies of mobile edge co…


Cited by 15 publications (8 citation statements). References 29 publications.
“…Several existing works have shown that local models from all existing workers in the edge ecosystem need not be required to obtain an efficient model [13], [28], [29]. In the next set of experiments, we show that relying on a lesser number of workers improves C considerably without compromising on A.…”
Section: B. Optimal Worker Percentage for Federated Learning
confidence: 65%
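The observation quoted above, that aggregating updates from only a fraction of workers can yield an efficient model without using every worker, can be sketched as a minimal FedAvg-style round. The worker models and the 50% sampling fraction below are illustrative, not taken from the cited experiments:

```python
import random

def fedavg(updates):
    """Element-wise average of a list of model-weight lists (FedAvg)."""
    n = len(updates)
    return [sum(w) / n for w in zip(*updates)]

def round_with_fraction(local_updates, fraction, seed=0):
    """Aggregate only a randomly sampled fraction of worker updates."""
    k = max(1, int(len(local_updates) * fraction))
    random.seed(seed)
    chosen = random.sample(local_updates, k)
    return fedavg(chosen)

# Ten workers, each holding a toy 3-parameter "model".
workers = [[float(i), float(i) * 2, float(i) * 3] for i in range(10)]
full_model = fedavg(workers)                      # all workers participate
half_model = round_with_fraction(workers, 0.5)    # only 50% of workers
```

Sampling fewer workers per round cuts communication cost roughly in proportion to the fraction; the cited works report that accuracy need not degrade when the fraction is chosen well.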
“…Wang et al [12] show the potential of integrating Deep Learning and Federated Learning in an Edge computing environment and how edge computing is more efficient in handling the heterogeneous node capabilities than other networks. Several works talk about node capabilities when we have the scenarios of heterogeneous resource allocation [13]- [15]. However, these are only aimed at minimizing the computational time and power and might compromise on latency.…”
Section: Related Work
confidence: 99%
“…[12] developed a self-balancing system based on mediator edge servers that gather subsets of clients with near-uniform data distributions and aggregate the trained models before sending them to the central server to build a global one. Similarly, [39] achieved energy-consumption reduction by balancing the exchange of parameters with L edge servers with respect to the training time and communication budget, where each edge server incorporates a small number of clients; [66] used a hierarchical aggregation of the model updates to overcome the communication overload between the nodes and the server. Moreover, [25] and [74] proposed a cloud-edge-client scheme wherein the clients offload part or all of the training tasks to the edge servers, which receive portions of the clients' data for the training. This approach has some flaws w.r.t.…”
Section: Hybrid Scheme
confidence: 99%
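The hierarchical schemes described above, where edge servers first aggregate their own clients and a central server then combines the edge models, can be sketched as a two-level averaging step. The size-weighted second level is an assumed simplification, not any single cited paper's exact protocol:

```python
def average(models):
    """Element-wise mean of equally weighted model-weight lists."""
    return [sum(w) / len(models) for w in zip(*models)]

def hierarchical_round(clusters):
    """Two-level aggregation: each edge server averages its own
    clients' models, then the central server combines the edge
    models weighted by the number of clients behind each one."""
    edge_models = [average(c) for c in clusters]
    sizes = [len(c) for c in clusters]
    total = sum(sizes)
    return [sum(s * w for s, w in zip(sizes, ws)) / total
            for ws in zip(*edge_models)]
```

Weighting by cluster size makes the two-level result identical to a flat average over all clients, while each client only ever talks to its nearby edge server, which is the communication saving the quoted works target.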
“…In the same manner, Jeon et al. [171] presented an adaptive client-selection algorithm in FEEL based on the device resources and data quality. Energy-limited end devices are relieved of the power consumption of heavy computation by sending their datasets to the associated federated edges.…”
Section: B. Clients Selection
confidence: 99%
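An adaptive client-selection rule of the kind attributed to Jeon et al. can be sketched as a ranking over device resources and data quality. The scoring weights, field names, and equal 0.5/0.5 mix below are hypothetical illustrations, not the paper's actual criterion:

```python
def select_clients(clients, budget):
    """Rank clients by a hypothetical score combining battery level
    and data quality (both assumed normalized to [0, 1]) and keep
    the top `budget` clients for the next training round."""
    ranked = sorted(
        clients,
        key=lambda c: 0.5 * c["battery"] + 0.5 * c["quality"],
        reverse=True,
    )
    return [c["id"] for c in ranked[:budget]]

# Illustrative pool: "b" wins on combined score despite lower battery.
pool = [
    {"id": "a", "battery": 0.9, "quality": 0.2},
    {"id": "b", "battery": 0.5, "quality": 0.9},
    {"id": "c", "battery": 0.1, "quality": 0.1},
]
chosen = select_clients(pool, 2)  # → ["b", "a"]
```

Any monotone score over resource and data-quality signals fits this template; the point of the quoted statement is that energy-limited devices can be deprioritized or offloaded rather than forced to train locally.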