GLOBECOM 2022 - 2022 IEEE Global Communications Conference
DOI: 10.1109/globecom48099.2022.10000892
ChannelFed: Enabling Personalized Federated Learning via Localized Channel Attention

Cited by 4 publications (3 citation statements)
References 12 publications
“…Concretely, the attention value is computed according to the average reward, average loss, training data size, etc., increasing the possibility of obtaining a more powerful agent model after aggregation. The attention-based module is also widely used for personalized federated learning [21, 22]. In [22], the authors design a PFL framework termed ChannelFed that uses an attention module to assign different weights to channels on the client side.…”
Section: Attentive Aggregation
Confidence: 99%
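
The excerpt above describes attentive aggregation: the server scores each client from metadata such as average loss and training-set size, then normalizes the scores into aggregation weights with a softmax. Below is a minimal sketch of that idea, assuming NumPy parameter dicts; the scoring rule (log data size minus average loss), the `temperature` parameter, and all names are illustrative assumptions, not the cited papers' exact formulation.

```python
import numpy as np

def attentive_aggregate(client_states, avg_losses, data_sizes, temperature=1.0):
    """Aggregate client models with attention weights derived from metadata.

    client_states: list of dicts mapping parameter name -> np.ndarray
    avg_losses:    per-client average training loss (lower is better)
    data_sizes:    per-client number of training samples (more is better)
    """
    losses = np.asarray(avg_losses, dtype=np.float64)
    sizes = np.asarray(data_sizes, dtype=np.float64)

    # Score each client: favor large datasets and low training loss.
    scores = np.log(sizes) - losses

    # Softmax over scores yields the attention (aggregation) weights.
    logits = scores / temperature
    logits -= logits.max()  # for numerical stability
    weights = np.exp(logits) / np.exp(logits).sum()

    # Attention-weighted average of every parameter tensor across clients.
    return {
        name: np.tensordot(weights,
                           np.stack([s[name] for s in client_states]),
                           axes=1)
        for name in client_states[0]
    }
```

With equal losses and temperature 1, the weights become proportional to data size, i.e. FedAvg-style weighting; lowering the temperature concentrates the aggregate on the best-scoring clients.
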
“…The attention-based module is also widely used for personalized federated learning [21, 22]. In [22], the authors design a PFL framework termed ChannelFed that uses an attention module to assign different weights to channels on the client side. After incorporating personalized channel attention, the performance of the local model can be improved and client-specific knowledge can be better captured.…”
Section: Attentive Aggregation
Confidence: 99%
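
The ChannelFed description above suggests a channel-attention module that reweights feature-map channels per client and is kept local (personalized) rather than aggregated at the server. The following is a sketch under stated assumptions: a squeeze-and-excitation-style module stands in for the paper's attention block, and a name-based split separates personalized from shared parameters; the module design, `reduction` ratio, and `attn` naming convention are all assumptions, not details from the excerpt.

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """SE-style channel attention: pool each channel to a scalar, pass the
    descriptors through a bottleneck MLP, and rescale channels by the result."""

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (N, C, H, W)
        squeezed = x.mean(dim=(2, 3))                     # (N, C) descriptors
        weights = self.fc(squeezed)                       # (N, C), in (0, 1)
        return x * weights[:, :, None, None]              # per-channel rescale

def split_state(model: nn.Module, attention_key: str = "attn"):
    """Partition parameters into the shared part (uploaded for aggregation)
    and the personalized channel-attention part (kept on the client)."""
    shared, personal = {}, {}
    for name, tensor in model.state_dict().items():
        (personal if attention_key in name else shared)[name] = tensor
    return shared, personal
```

Each round, a client would upload only `shared`; loading the server aggregate back with `model.load_state_dict(aggregated, strict=False)` leaves the local attention parameters untouched, so the client-specific channel weighting described in the excerpt persists across rounds.
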
“…Consequences of FL failure include: (1) clients being unwilling to participate in FL, (2) wasted rounds of client computation (and client-server interactions), and (3) disintegration of the entire federation in the worst case. Many remedies have been proposed to prevent FL failure [9, 12, 17-27]. However, an FL system using the existing solutions faces a dilemma: if the remedy is predetermined to be used, it incurs extra (high) costs even if FL could have done well without such a remedy.…”
Section: Introduction
Confidence: 99%