2023
DOI: 10.1109/jiot.2022.3209865
Federated Learning via Attentive Margin of Semantic Feature Representations

Abstract: Vision systems mounted on home robots need to interact with unseen classes in changing environments. Robots have limited computational resources, labelled data, and storage capability. These requirements pose some unique challenges: models should adapt without forgetting past knowledge in a data- and parameter-efficient way. We characterize the problem as few-shot (FS) online continual learning (OCL), where robotic agents learn from a non-repeated stream of few-shot data, updating only a few model parameters. Add…

Cited by 5 publications (4 citation statements)
References 85 publications
“…In FL, a first possible strategy is to correct client drift by computing client deviations from margins of prototypical representations learned on distributed data. These margins can be exploited to drive the federated optimization via an attention mechanism [161], addressing system and statistical heterogeneity. Prototypes can also be transmitted instead of model weights [203] to reduce communication cost, to allow clients to learn a more customized local model, and to be more robust to gradient-based attacks [28], [274], since high-level statistical information (prototypes) is more privacy-compliant than raw features.…”
Section: Representation Learning Techniques Aim At Improving a Downst...
confidence: 99%
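To make the prototype-exchange idea above concrete, here is a minimal sketch of prototype-based federated aggregation in the spirit of [203]: each client computes per-class mean embeddings (prototypes) from its local data, and the server averages them across clients instead of averaging model weights. The function names and toy data are illustrative assumptions, not the actual method of the cited works.

```python
import numpy as np

def local_prototypes(features, labels):
    """Client side: per-class mean embedding (prototype) over local data."""
    return {c: features[labels == c].mean(axis=0) for c in np.unique(labels)}

def aggregate_prototypes(client_protos):
    """Server side: average each class prototype across the clients that have it."""
    collected = {}
    for protos in client_protos:
        for c, p in protos.items():
            collected.setdefault(c, []).append(p)
    return {c: np.mean(ps, axis=0) for c, ps in collected.items()}

# Two toy clients, 4-dimensional embeddings, classes 0 and 1 on each client.
rng = np.random.default_rng(0)
clients = []
for _ in range(2):
    feats = rng.normal(size=(6, 4))
    labels = np.array([0, 0, 0, 1, 1, 1])
    clients.append(local_prototypes(feats, labels))

global_protos = aggregate_prototypes(clients)
print(sorted(global_protos.keys()))
```

Only the prototype dictionaries cross the network, so the payload scales with the number of classes times the embedding size rather than the model size, which is the communication and privacy advantage the statement refers to.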
“…Additionally, the distribution of the data across different clients may be heterogeneous, which can lead to performance degradation if not addressed appropriately. Despite these challenges, recent works have demonstrated promising results in applying federated learning to computer vision tasks [24], [62], [161], [193], and ongoing research aims to improve the performance and scalability of FL in this domain (an overview of the distribution of the papers across the different tasks is shown in Fig. 4).…”
Section: Fl In Computer Vision: Tasks and Approaches
confidence: 99%