2018 IEEE Visual Communications and Image Processing (VCIP)
DOI: 10.1109/vcip.2018.8698609
Two-Stream Federated Learning: Reduce the Communication Costs

Cited by 95 publications (47 citation statements); references 5 publications.
“…The authors of [117] also note that the biggest issues in FL are mainly security and privacy. As such, efficient FL algorithms that deliver models with high performance and privacy protection without adding computational burden are desirable [118], [119]. Because local models are trained via newer data to highlight new updates, it is likely that adversaries can influence the local training data-sets to compromise the models' results.…”
Section: Challenges and Limitations of Federated Learning
confidence: 99%
“…Third, Sahu et al [55] (and similarly Yao et al [75]) propose including an additional regularisation term during local training such that the solution space for weights is close to the global weights of the last epoch.…”
Section: Non-IID Data
confidence: 99%
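The regularisation described in the statement above (keeping local weights close to the last global weights, as in Sahu et al.'s FedProx) can be sketched as a proximal term added to the local objective. This is a minimal illustration, not the cited papers' exact formulation; the names `mu`, `w_local`, and `w_global` are assumptions for the sketch.

```python
import numpy as np

def proximal_local_loss(local_loss, w_local, w_global, mu=0.01):
    """FedProx-style local objective (illustrative sketch):
    the usual local training loss plus a proximal penalty
    (mu/2) * ||w_local - w_global||^2 that keeps the client's
    solution near the last global weights."""
    prox = 0.5 * mu * np.sum((w_local - w_global) ** 2)
    return local_loss + prox
```

With `mu = 0`, this reduces to plain local training; larger `mu` pulls each client's solution space toward the global model, which is the mechanism both citing statements attribute to the regularised local training.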
“…FL requires clients to repeatedly send their model parameter updates and in return receive the new global parameters. Especially when there is a large number of clients (>100) involved in training, the communication efficiency becomes the main bottleneck of FL to achieve a quick model convergence [8,22,43,50,52,59,62,75,78,103]. This motivates a branch of FL research looking into ways to improve the efficiency of information exchange between the participants.…”
Section: Communication Efficiency
confidence: 99%
“…Additionally, communication efficiency is a key issue of federated learning at present. Yao et al [28] presented a two-stream model, rather than a single model, for training, applying a maximum mean discrepancy (MMD) constraint at each iteration to alleviate resource constraints. Vogels et al [29] introduced a novel low-rank gradient compression based on power iteration to aggregate models rapidly and achieve wall-clock speedups.…”
Section: Related Work
confidence: 99%
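The maximum mean discrepancy (MMD) penalty mentioned in the statement above measures the distance between the feature distributions of the two streams. A minimal sketch, assuming a linear kernel (the mean-embedding distance); the paper itself may use a different kernel, and `feats_a`/`feats_b` are illustrative names for the two streams' feature batches.

```python
import numpy as np

def mmd2_linear(feats_a, feats_b):
    """Squared MMD with a linear kernel: the squared distance
    between the mean feature embeddings of two batches.
    Used here as an illustrative discrepancy penalty between
    the two streams of a two-stream model."""
    diff = feats_a.mean(axis=0) - feats_b.mean(axis=0)
    return float(diff @ diff)
```

When the two streams produce identically distributed features the penalty is zero, so minimising it alongside the task loss encourages the local stream to stay consistent with the global one.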