2023
DOI: 10.1007/s10479-023-05203-x

HFML: heterogeneous hierarchical federated mutual learning on non-IID data

Abstract: Federated learning (FL) has emerged as a privacy-preserving paradigm that trains neural networks on edge devices without collecting data at a central server. However, FL encounters an inherent challenge in dealing with non-independent and identically distributed (non-IID) data among devices. To address this challenge, this paper proposes a hard feature matching data synthesis (HFMDS) method to share auxiliary data besides local models. Specifically, synthetic data are generated by learning the essential class-…
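To make the federated setup described in the abstract concrete, the following is a minimal sketch of a FedAvg-style round in which each client trains locally on its own (non-IID) data and the server averages the resulting weights. The logistic-regression model, the helper names, and the optional shared auxiliary batch are illustrative assumptions; this is not the paper's HFMDS procedure.

```python
# Minimal sketch (assumption): one FedAvg-style round with an optional shared
# auxiliary batch mixed into each client's non-IID local data. This is NOT the
# paper's HFMDS method, only an illustration of the data-sharing idea.
import numpy as np

def local_sgd(w, X, y, lr=0.1, epochs=5):
    """Run plain SGD for a logistic-regression client model."""
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = 1.0 / (1.0 + np.exp(-xi @ w))    # predicted probability
            w -= lr * (p - yi) * xi              # logistic-loss gradient step
    return w

def fedavg_round(global_w, clients, aux=None):
    """One communication round: local training, then size-weighted averaging."""
    updates, sizes = [], []
    for X, y in clients:
        if aux is not None:                      # append shared auxiliary data (assumption)
            X = np.vstack([X, aux[0]])
            y = np.concatenate([y, aux[1]])
        updates.append(local_sgd(global_w.copy(), X, y))
        sizes.append(len(y))
    sizes = np.array(sizes, dtype=float)
    return np.average(updates, axis=0, weights=sizes / sizes.sum())

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d = 5
    # Two clients with skewed (non-IID) label distributions.
    clients = [(rng.normal(size=(40, d)), np.zeros(40)),
               (rng.normal(size=(40, d)), np.ones(40))]
    aux = (rng.normal(size=(10, d)), rng.integers(0, 2, size=10))  # shared synthetic batch
    w = np.zeros(d)
    for _ in range(3):
        w = fedavg_round(w, clients, aux)
    print("global weights:", w)
```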

Cited by 5 publications (5 citation statements)
References 54 publications (67 reference statements)
“…A different line of work includes the work by [55], which attempts to obtain global cluster centers by generating synthetic data at the server instead of using the original data. Similarly, [32] draws inspiration from differential privacy, which is quite different from our work. Federated Client Clustering: In federated client clustering, clients are clustered together to intelligently choose a subset of clients for the client update step.…”
Section: Related Work
confidence: 92%
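The federated client clustering idea mentioned in the citation above can be sketched as follows: clients are grouped by the similarity of their model updates, and one representative per group is selected for the next update step. The k-means routine, the function names, and the synthetic update vectors are assumptions for illustration, not any cited paper's algorithm.

```python
# Illustrative sketch (assumption): cluster clients by their model-update
# vectors and pick one representative per cluster for the next round.
# `select_clients_by_cluster` is a hypothetical helper, not from a cited paper.
import numpy as np

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means over per-client update vectors."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((points[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for c in range(k):
            if np.any(labels == c):
                centers[c] = points[labels == c].mean(axis=0)
    return labels

def select_clients_by_cluster(client_updates, k=2, seed=0):
    """Group clients by update similarity, then sample one client per cluster."""
    rng = np.random.default_rng(seed)
    labels = kmeans(np.asarray(client_updates), k)
    return [int(rng.choice(np.flatnonzero(labels == c)))
            for c in range(k) if np.any(labels == c)]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Synthetic per-client update vectors forming two natural groups.
    updates = np.vstack([rng.normal(0, 0.1, size=(5, 4)),
                         rng.normal(3, 0.1, size=(5, 4))])
    print("selected client indices:", select_clients_by_cluster(updates, k=2))
```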
“…Li et al. [20] give a proof of Local SGD convergence in the case of non-IID datasets and show that Local SGD still performs well in terms of convergence rate even in the non-IID case.…”
Section: Related Work
confidence: 99%
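As a concrete illustration of the Local SGD scheme referenced above (periodic averaging after a fixed number of local gradient steps), here is a minimal sketch on a toy quadratic objective with heterogeneous per-worker optima; the objective, step size, and parameter names are assumptions, and the snippet does not reproduce the cited convergence analysis.

```python
# Illustrative sketch (assumption): Local SGD on a simple quadratic objective.
# Each of K workers takes H local gradient steps between synchronizations; the
# averaged iterate still approaches the minimizer of the average loss even
# though each worker's target a_k differs, i.e. the data are non-IID.
import numpy as np

def local_sgd_quadratic(targets, rounds=20, H=5, lr=0.1):
    """Minimize the average of f_k(w) = 0.5 * ||w - a_k||^2 with periodic averaging."""
    K, d = targets.shape
    workers = np.zeros((K, d))                    # per-worker iterates
    for _ in range(rounds):
        for _ in range(H):                        # H local steps per round
            workers -= lr * (workers - targets)   # gradient of each f_k
        avg = workers.mean(axis=0)                # synchronize: average iterates
        workers[:] = avg
    return avg

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    targets = rng.normal(size=(4, 3))             # distinct optima => heterogeneity
    w = local_sgd_quadratic(targets)
    print("averaged iterate:", w)
    print("global optimum  :", targets.mean(axis=0))  # true minimizer of the average loss
```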
“…To avoid sharing additional information, CLIMB [69], BalanceFL [47], and FedLC [67] learn a balanced federated model by reweighting each client's importance during aggregation based on the empirical loss, by balanced class sampling with self-entropy regularization, and by logits calibration with pairwise margins, respectively. FedRS [70] limits classifier updates when there are missing classes. However, conforming the federated model to a balanced class distribution may be detrimental for some clients, e.g., being sub-optimal compared to their locally-learned models.…”
Section: B. Class Imbalance Learning
confidence: 99%
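The loss-based reweighting idea mentioned in the citation above can be illustrated with a minimal aggregation sketch in which client models are averaged with weights that increase with each client's empirical loss. The softmax weighting, the temperature parameter, and the function name are assumptions for illustration only and do not correspond to the actual CLIMB, BalanceFL, or FedLC implementations.

```python
# Minimal sketch (assumption): loss-aware reweighted aggregation. Client models
# are averaged with weights that grow with each client's empirical loss, so
# poorly-served (e.g., minority-class) clients get more influence. This is an
# illustration of the reweighting idea, not the actual CLIMB algorithm.
import numpy as np

def reweighted_aggregate(client_weights, client_losses, temperature=1.0):
    """Average client models with softmax weights over their empirical losses."""
    losses = np.asarray(client_losses, dtype=float)
    w = np.exp(losses / temperature)
    w /= w.sum()                                   # normalized importance per client
    return np.average(np.asarray(client_weights), axis=0, weights=w)

if __name__ == "__main__":
    # Three clients: the third has the highest loss and therefore the most weight.
    models = [np.array([1.0, 0.0]), np.array([0.8, 0.2]), np.array([0.0, 1.0])]
    losses = [0.2, 0.3, 1.5]
    print("aggregated model:", reweighted_aggregate(models, losses))
```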
“…From single-model FL, we include 1) local learning, where each client trains a model individually, 2) FedAvg [1] as a baseline comparison, and representative approaches including FedProx [32], MOON [23], CReRF [33], and the most recent state-of-the-art approaches such as BalanceFL [47], FedRS [70], and FedLC [67].…”
Section: Implementation Details
confidence: 99%