2020
DOI: 10.1109/access.2020.3038287
EdgeFed: Optimized Federated Learning Based on Edge Computing

Abstract: Federated learning (FL), an emerging framework for training a deep learning model from decentralized data, has received considerable attention with the development of mobile internet technology. Modern mobile devices often have access to rich but privacy-sensitive data, while their computational abilities are limited by hardware restrictions. In previous works based on the federated averaging (FedAvg) algorithm, mobile devices must perform a large amount of computation, which is time-consuming in the proce…
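The abstract contrasts device-side training under FedAvg with EdgeFed's offloading of work to edge servers. As a point of reference, the sketch below illustrates the standard FedAvg aggregation step (a weighted average of client models by local data size). It is a minimal, illustrative sketch only, not the authors' implementation; the list-of-arrays weight representation, client count, and sample sizes are assumptions made for the example.

```python
# Minimal FedAvg aggregation sketch (illustrative only, not the paper's code).
import numpy as np

def fedavg_aggregate(client_weights, client_sample_counts):
    """Weighted average of client models, proportional to each client's data size."""
    total = sum(client_sample_counts)
    # Start from a zero model with the same shapes as the first client's weights.
    aggregated = [np.zeros_like(layer) for layer in client_weights[0]]
    for weights, n_samples in zip(client_weights, client_sample_counts):
        for i, layer in enumerate(weights):
            aggregated[i] += (n_samples / total) * layer
    return aggregated

# Hypothetical example: three mobile devices, each holding a small two-layer model.
clients = [[np.random.randn(4, 4), np.random.randn(4)] for _ in range(3)]
counts = [120, 300, 80]  # local dataset sizes, made up for illustration
global_model = fedavg_aggregate(clients, counts)
```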

Cited by 130 publications (61 citation statements). References 21 publications.
“…There are some studies focused on federated edge computing. In [ 22 ], the authors proposed EdgeFed to apply local model updating to the edge server, which decreases computation cost and communication expense in the center data node. In [ 23 ], the authors designed a privacy-aware service placement (PSP) scheme in an edge-cloud system, which efficiently addressed individual privacy issues and provided better QoS to users.…”
Section: Related Work (mentioning)
confidence: 99%
“…However, such an idea where there is a middle entity in form of an edge server between sensing devices and the cloud is also gaining attention. The research work proposed in [108] is another effort based on a new hierarchical version of FL. In this, the edge serves as a local aggregator and the cloud as a global aggregator.…”
Section: Federated Learning and IIoT (mentioning)
confidence: 99%
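The hierarchical pattern described in the statement above, with the edge server acting as a local aggregator and the cloud as the global aggregator, can be sketched as a two-tier weighted average. The sketch below is a minimal illustration under the assumption that models are lists of NumPy arrays and that both tiers weight by the number of training samples; the function names and topology are hypothetical and are not taken from [108] or the EdgeFed paper.

```python
# Two-tier (edge -> cloud) aggregation sketch; illustrative assumptions only.
import numpy as np

def weighted_average(models, sample_counts):
    """Average a list of models, weighting each by its number of training samples."""
    total = sum(sample_counts)
    avg = [np.zeros_like(layer) for layer in models[0]]
    for model, n in zip(models, sample_counts):
        for i, layer in enumerate(model):
            avg[i] += (n / total) * layer
    return avg

def edge_round(device_models, device_counts):
    """Edge server aggregates updates from its own devices only."""
    return weighted_average(device_models, device_counts), sum(device_counts)

def cloud_round(edge_results):
    """Cloud aggregates the pre-averaged edge models, weighted by their data volume."""
    edge_models = [model for model, _ in edge_results]
    edge_counts = [count for _, count in edge_results]
    return weighted_average(edge_models, edge_counts)

# Hypothetical topology: two edge servers, each serving a few devices.
devices_edge1 = [[np.random.randn(4, 4)] for _ in range(3)]
devices_edge2 = [[np.random.randn(4, 4)] for _ in range(2)]
edge_results = [edge_round(devices_edge1, [100, 50, 80]),
                edge_round(devices_edge2, [200, 40])]
global_model = cloud_round(edge_results)
```

The design point this illustrates is that devices only ever communicate with their nearby edge server, which reduces both the per-round traffic reaching the cloud and the amount of computation left on the devices.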
“…In such a case, a local model update could report negative knowledge to the global aggregator as FL is based on data similarity of the participating clients [116]. However, a new version of FL in which a local server aggregates the data from IoT nodes and serves as a client to a global aggregator can help in overcoming such issues [108].…”
Section: Observations on FL (mentioning)
confidence: 99%
“…), without exchanging data among the nodes of the network (i.e. maintaining the training data of each node locally, in a decentralized way) [44] [45] [46]. Especially the latter characteristic renders FL a by-definition privacy-aware method, which can reliably mitigate many of the systemic privacy risks and costs resulting from traditional centralized ML [47] [48] [49].…”
Section: Federated Learning Paradigm (mentioning)
confidence: 99%