2023
DOI: 10.1109/tpds.2023.3238049
HiFlash: Communication-Efficient Hierarchical Federated Learning With Adaptive Staleness Control and Heterogeneity-Aware Client-Edge Association

Cited by 17 publications (7 citation statements)
References 31 publications
“…The edge servers and the clients perform one-step gradient descent as (6) for τ + 1 intra-set iterations. Then, the edge servers forward the model parameters to the cloud server, and cloud aggregation is performed according to (10). The specialized optimality gap is given in the next corollary.…”
Section: B Convergence Analysis
confidence: 99%
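The hierarchical update described in the excerpt above — clients take one-step gradient descent for τ + 1 intra-set iterations under an edge server, after which the cloud aggregates the edge models — can be sketched as follows. This is a minimal illustration with uniform averaging; the function names, learning rate, and weighting are assumptions for illustration, not the cited paper's exact rule.

```python
import numpy as np

def client_step(w, grad_fn, lr=0.1):
    """One-step gradient descent on a client's local loss."""
    return w - lr * grad_fn(w)

def hierarchical_round(w_cloud, edge_groups, lr=0.1, tau=4):
    """One cloud round of hierarchical FL: each edge set runs tau + 1
    intra-set iterations (client steps plus edge averaging), then the
    cloud averages the edge models (uniform means for simplicity)."""
    edge_models = []
    for clients in edge_groups:               # clients = list of gradient fns
        w_edge = w_cloud.copy()
        for _ in range(tau + 1):              # tau + 1 intra-set iterations
            locals_ = [client_step(w_edge, g, lr) for g in clients]
            w_edge = np.mean(locals_, axis=0)   # edge aggregation
        edge_models.append(w_edge)
    return np.mean(edge_models, axis=0)       # cloud aggregation
```

On a simple quadratic loss per client (gradient `w - target`), one round moves the cloud model toward the mean of the client targets, which is the behavior the convergence analysis bounds.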
“…Staleness in asynchronous SFL is further investigated in [21], which introduces an asynchronous update rule based on the notion that staleness effects in SFL are similar across consecutive training epochs. In [22], the connection bottleneck is tackled by combining synchronous orchestration with a dynamic aggregation rule that ignores stragglers.…”
Section: A Related Work
confidence: 99%
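A common way to realize the staleness control mentioned above is to discount stale updates during aggregation. The sketch below uses a FedAsync-style polynomial decay; it is an assumed, illustrative rule, not necessarily the update rule of [21] or [22], and all names and defaults are hypothetical.

```python
def staleness_weight(alpha0, staleness, a=0.5):
    """Polynomial staleness discount (FedAsync-style): the staler an
    update, the smaller its mixing weight."""
    return alpha0 * (1 + staleness) ** (-a)

def async_merge(w_global, w_update, staleness, alpha0=0.6):
    """Mix a possibly stale update into the global model with a
    staleness-discounted weight."""
    a = staleness_weight(alpha0, staleness)
    return [(1 - a) * g + a * u for g, u in zip(w_global, w_update)]
```

A fresh update (staleness 0) is mixed in at the full base rate, while an update delayed by several rounds contributes proportionally less, which limits the damage from stragglers without discarding them outright.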
“…The edge server acts as an intermediary in the system, grouping the clients and fusing the machine learning models, which is referred to as edge-assisted FL. By pre-aggregating local models at the edge layer, the volume of model updates can be reduced by an order of magnitude [6]. For the same total bandwidth, fewer clients occupy the channel, so the bandwidth available to each is increased.…”
Section: Introduction
confidence: 99%
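The "order of magnitude" reduction claimed in the excerpt above follows from simple counting: with edge pre-aggregation, only one fused model per edge server reaches the cloud instead of one per client. A back-of-envelope sketch, with purely illustrative numbers:

```python
def cloud_traffic(model_bytes, n_clients, n_edges=None):
    """Bytes arriving at the cloud per round. Without edges, every client
    uploads its model; with edge pre-aggregation, only one fused model
    per edge server is forwarded upward."""
    uploads = n_clients if n_edges is None else n_edges
    return uploads * model_bytes

# 100 clients behind 10 edge servers, 4 MB model update:
flat = cloud_traffic(4_000_000, 100)        # direct cloud FL
edge = cloud_traffic(4_000_000, 100, 10)    # edge-assisted FL
```

With 10 clients per edge server, cloud-bound traffic per round drops by a factor of 10 — exactly one order of magnitude for this (assumed) topology.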
“…However, edge-assisted federated learning is constrained by system heterogeneity, where clients vary in their computing and communication capabilities, and by data heterogeneity, where different clients hold diverse local data. Furthermore, the communication bottleneck between the cloud and the clients remains a pressing issue [6]. For example, during synchronous global model aggregation the clients remain idle, so a significant amount of waiting time is wasted.…”
Section: Introduction
confidence: 99%