Proceedings of the 22nd International Middleware Conference 2021
DOI: 10.1145/3464298.3493403
Implicit Model Specialization through DAG-Based Decentralized Federated Learning

Abstract: Federated learning allows a group of distributed clients to train a common machine learning model on private data. The exchange of model updates is managed either by a central entity or in a decentralized way, e.g., by a blockchain. However, the strong generalization across all clients makes these approaches unsuited for non-independent and identically distributed (non-IID) data. We propose a unified approach to decentralization and personalization in federated learning that is based on a directed acyclic graph …
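The abstract describes organizing model updates in a directed acyclic graph so that clients implicitly specialize by following updates that fit their local data. A minimal sketch of that idea is an accuracy-biased walk over a DAG of updates; the class names, the scoring placeholder, and the walk heuristic below are illustrative assumptions, not the paper's actual implementation.

```python
import random

class Node:
    """One model update in the DAG. `model` is a flat list of weights here."""
    def __init__(self, model, parents=()):
        self.model = model
        self.parents = list(parents)
        self.children = []
        for p in self.parents:
            p.children.append(self)

def local_accuracy(model, data):
    # Placeholder scoring function (assumption): in a real system this would
    # evaluate the model on the client's private data; here, closer weights
    # to `data` score higher.
    return -sum((w - x) ** 2 for w, x in zip(model, data))

def biased_walk(genesis, data, rng):
    """Walk from the genesis update toward the tips, at each step preferring
    the child whose model performs best on this client's local data. Clients
    with similar data thus converge on the same branch of the DAG, which is
    the implicit-specialization effect the abstract refers to."""
    node = genesis
    while node.children:
        node = max(node.children,
                   key=lambda c: (local_accuracy(c.model, data), rng.random()))
    return node  # the tip this client would build its next update on
```

Clients with different data distributions end their walks at different tips, so each branch of the DAG drifts toward a specialized model without any explicit clustering step.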

Cited by 8 publications (13 citation statements). References 52 publications.
“…These features of DAG-DLT make it a suitable choice for decentralized FL in edge-device environments where resource limitations and connectivity constraints are prevalent. Among the existing studies [20], [30]–[33], the Implicit Model Specialization FL approach called Specializing DAG FL (SDAFGL) [20], proposed by Beilharz et al., is gaining attention. SDAFGL focuses on improving model accuracy locally on the client side.…”
Section: Federated Learning with DAG-DLT
confidence: 99%
“…For the CNN model, we use the last two layers of the model for similarity measures. To assess the performance of our framework, we compared it with five baselines: FedAvg [4], FlexCFL [11], IFCA [12], FeSEM [13], and SDAFGL [20]. The details are as follows.…”
Section: A. Experimental Setup
confidence: 99%
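The excerpt above compares clients by a similarity measure over the last two layers of a CNN. A hedged sketch of one common way to do this is cosine similarity over the flattened parameters of the final layers; the function names and the choice of cosine similarity are assumptions for illustration, not the cited framework's code.

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity of two equal-length flat weight vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def last_layers_similarity(params_a, params_b, n_layers=2):
    """Compare two models on their last `n_layers` only.

    params_a / params_b: per-layer lists of flat weights (an assumed layout).
    Restricting the comparison to the final layers focuses on the
    task-specific part of a CNN rather than the generic early features.
    """
    flat_a = [w for layer in params_a[-n_layers:] for w in layer]
    flat_b = [w for layer in params_b[-n_layers:] for w in layer]
    return cosine_similarity(flat_a, flat_b)
```

A measure like this can drive the clustered-FL baselines the excerpt lists (e.g., grouping clients whose last-layer updates point in similar directions).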