2022
DOI: 10.1007/978-3-031-22390-7_27

CFL: Cluster Federated Learning in Large-Scale Peer-to-Peer Networks

Cited by 4 publications (4 citation statements)
References 5 publications
“…Figure 5a shows how the code length evolves as the number of parameters increases. We compared the expected code length of the proposed Elias coding for FedLP-Q (as in (18)), the Elias coding for QSGD (as in (19)), and the non-quantization coding (Float32). For a parameter count of less than 600 K, the Elias-based coding scheme proposed in Section 3.2.2 led to the lowest expected code length.…”
Section: Results (mentioning)
confidence: 99%
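
The comparison above turns on a property of Elias codes: codeword length grows with the integer being encoded, so the low-magnitude symbols that dominate quantized model updates cost only a few bits each, versus a flat 32 bits per parameter for Float32. A minimal sketch of that expected-code-length calculation, using the Elias gamma variant and an invented symbol distribution (the actual coding schemes in equations (18) and (19) of the cited paper are not reproduced here):

```python
def elias_gamma_length(n):
    # Elias gamma codeword for a positive integer n: floor(log2 n) zeros
    # followed by the binary form of n, i.e. 2*floor(log2 n) + 1 bits.
    assert n >= 1
    return 2 * (n.bit_length() - 1) + 1

def expected_code_length(probs):
    # Expected bits per symbol under a distribution over
    # positive-integer quantization symbols.
    return sum(p * elias_gamma_length(s) for s, p in probs.items())

# Invented distribution: small magnitudes dominate a quantized update,
# so the variable-length code beats a fixed 32-bit representation.
probs = {1: 0.6, 2: 0.25, 3: 0.1, 4: 0.05}
print(f"Elias gamma: {expected_code_length(probs):.2f} bits/symbol; Float32: 32 bits")
```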
“…For example, Horvath et al. pointed out that the order of the parameters matters in pruning aggregation and should be taken into consideration [17]. Depth-wise schemes based on blocks and layers were also discussed in [18,19], which showed that model compression in FL should account for the functionality of different parts of the local models. The FedLP schemes in [19] specifically provided common pruning solutions for model-heterogeneity cases, thereby extending the applicable FL scenarios.…”
Section: Related Work (mentioning)
confidence: 99%
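
Depth-wise (layer- or block-level) pruning, as opposed to weight-level pruning, means each client drops whole layers, and the aggregator averages each layer over only the clients that kept it. A minimal sketch of that aggregation idea, with invented helper names (prune_layers, aggregate) rather than the actual FedLP-Q procedure from [19]:

```python
import random
import numpy as np

def prune_layers(model_layers, keep_prob=0.7):
    # Depth-wise pruning: each layer survives with probability keep_prob;
    # dropped layers are simply not transmitted by the client.
    return {name: w for name, w in model_layers.items()
            if random.random() < keep_prob}

def aggregate(client_updates, layer_names):
    # Average each layer over only the clients that actually kept it,
    # so heterogeneous (partially pruned) models can still be combined.
    global_model = {}
    for name in layer_names:
        kept = [u[name] for u in client_updates if name in u]
        if kept:  # at least one client transmitted this layer
            global_model[name] = np.mean(kept, axis=0)
    return global_model

# Toy round: three clients pruning the same two-layer model.
layers = {"conv1": np.ones(4), "fc": np.zeros(4)}
updates = [prune_layers(layers) for _ in range(3)]
print(aggregate(updates, list(layers)))
```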
“…To demonstrate the accelerated intra-subgraph propagation process facilitated by AST in the IoV environment, a comparison is drawn with propagation patterns in the PPT algorithm [16] and the CFL algorithm [17]. Both algorithms are grounded in P2P-mode federated learning and employ a single-line propagation mode of depth-first traversal.…”
Section: Aggregated Simultaneous Transmission (mentioning)
confidence: 99%
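
For context, "single-line propagation with depth-first traversal" means the model is handed from one peer to the next along a single path, with no parallel fan-out, which is what AST's simultaneous intra-subgraph transmission is claimed to accelerate. A minimal sketch of such a traversal over a toy peer graph (names are illustrative, not taken from the PPT or CFL implementations):

```python
def depth_first_propagate(graph, start, on_visit):
    # Single-line propagation: the model follows one path at a time,
    # visiting peers in depth-first order (no parallel transmissions).
    visited, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node in visited:
            continue
        visited.add(node)
        on_visit(node)  # e.g., local training, then forward the model
        stack.extend(n for n in graph[node] if n not in visited)
    return visited

# Toy peer graph: adjacency lists keyed by node id.
peers = {0: [1, 2], 1: [3], 2: [4], 3: [], 4: []}
order = []
depth_first_propagate(peers, 0, order.append)
print("visit order:", order)  # one sequential chain of handoffs
```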