2020 International Conferences on Internet of Things (iThings) and IEEE Green Computing and Communications (GreenCom) and IEEE 2020
DOI: 10.1109/ithings-greencom-cpscom-smartdata-cybermatics50389.2020.00119
A Selective Model Aggregation Approach in Federated Learning for Online Anomaly Detection

Cited by 13 publications (8 citation statements) | References 13 publications
“…In Qin et al (2020), the authors worked on online anomaly detection in FL. They focus on ML methods to build local anomaly detection on edge devices.…”

Section: FL Applications
confidence: 99%
“…The more machines/institutions take part in a federation, the more important the ability to scale becomes. As mentioned in the previous section, to the best of our knowledge, a consolidated way of detecting "poor" training contributions (coming from institutions with corrupted or redundant data) is still missing, and aggregation functions are currently being evaluated by the research community [26] [31] [33]. Another implication at large scale is the infrastructure and connectivity chosen by the institutions for communication [14].…”

Section: FL Challenges
confidence: 99%
“…Rather than simply adopting FL, other papers seek to improve the framework in various ways. Study [29] raises the concern that some clients participating in FL could have bad data or be under attack, and would thus send low-quality weights to the aggregation server, degrading the global model. It proposes that the server first test each local model sent by each client on a preset dataset, discarding those that yield too high a loss.…”

Section: Federated Learning
confidence: 99%
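The selective aggregation described in the last excerpt can be sketched as follows. This is a hedged illustration, not the paper's actual implementation: the linear model, MSE loss, `loss_threshold` parameter, and `selective_aggregate` helper are all assumptions made for the sake of a self-contained toy example; the core idea — score each client model on preset validation data and average only those below a loss threshold — is what the cited study proposes.

```python
import numpy as np

def selective_aggregate(client_weights, X_val, y_val, loss_threshold):
    """Toy sketch of selective model aggregation: evaluate each client's
    model on a preset validation set, discard models whose loss is too
    high, and average the rest (FedAvg over the kept clients).
    All names and the linear-model stand-in are illustrative assumptions."""
    kept = []
    for w in client_weights:
        preds = X_val @ w                     # linear model stand-in for a local model
        loss = np.mean((preds - y_val) ** 2)  # MSE as the server-side test loss
        if loss <= loss_threshold:
            kept.append(w)
    if not kept:                              # fallback: keep all if every model fails
        kept = client_weights
    return np.mean(kept, axis=0)

# Toy federation: two honest clients near the true weights [1, 2],
# plus one "corrupted" client whose update would hurt the global model.
rng = np.random.default_rng(0)
X_val = rng.normal(size=(50, 2))
y_val = X_val @ np.array([1.0, 2.0])
clients = [np.array([1.0, 2.0]),
           np.array([1.1, 1.9]),
           np.array([10.0, -5.0])]           # bad update (e.g. poisoned data)
w_global = selective_aggregate(clients, X_val, y_val, loss_threshold=1.0)
# The corrupted client is filtered out; only the two honest updates are averaged.
```

The design choice matches the concern raised in the excerpt: filtering before averaging means one poisoned or low-quality update cannot drag the global model away, at the cost of the server needing a trusted preset dataset.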