2020
DOI: 10.36227/techrxiv.12503420.v1
Preprint

Differentially-private Federated Neural Architecture Search

Abstract: Neural architecture search, which aims to automatically search for architectures (e.g., convolution, max pooling) of neural networks that maximize validation performance, has achieved remarkable progress recently. In many application scenarios, several parties would like to collaboratively search for a shared neural architecture by leveraging data from all parties. However, due to privacy concerns, no party wants its data to be seen by other parties. To address this problem, we propose federated neural architecture search…
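The abstract describes searching for a shared architecture across parties without exposing any party's data. A common way to realize this is to clip each client's update to the shared architecture parameters and add Gaussian noise before averaging. The sketch below is illustrative only: the function name, clipping scheme, and noise calibration are assumptions, not the paper's actual mechanism.

```python
import numpy as np

def dp_federated_update(client_grads, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    """Aggregate per-client gradients of shared architecture parameters
    under a clip-and-noise (Gaussian mechanism) scheme.

    Illustrative sketch of differentially private federated aggregation;
    the paper's concrete algorithm may differ.
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    clipped = []
    for g in client_grads:
        norm = np.linalg.norm(g)
        # Scale each client's gradient so its L2 norm is at most clip_norm.
        clipped.append(g * min(1.0, clip_norm / max(norm, 1e-12)))
    avg = np.mean(clipped, axis=0)
    # Gaussian noise calibrated to the per-client sensitivity clip_norm / n.
    sigma = noise_multiplier * clip_norm / len(client_grads)
    return avg + rng.normal(0.0, sigma, size=avg.shape)
```

With `noise_multiplier=0` the function reduces to plain clipped federated averaging, which makes the privacy/utility trade-off controlled by a single knob.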

Cited by 13 publications (5 citation statements)
References 17 publications (14 reference statements)
“…In particular, with the assumption of adding a shared toy dataset in the federated setting, these KD-based FL methods can distill knowledge from a teacher model to student models with different model architectures. Some recent studies have also attempted to combine neural architecture search with federated learning (Zhu, Zhang, and Jin 2020; He, Annavaram, and Avestimehr 2020; Singh et al. 2020), which can be applied to discover a customized model architecture for each group of clients with different hardware capabilities and configurations. A collective learning platform is proposed in (Hoang et al. 2019) to handle heterogeneous architectures without access to the local training data and architectures.…”
Section: Heterogeneous Federated Learning
confidence: 99%
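The KD-based FL idea quoted above — distilling a teacher's predictions on a shared public batch into students with different architectures — centers on a temperature-scaled distillation loss. A minimal sketch, assuming softened teacher and student logits on the shared dataset (function names are illustrative):

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(teacher_logits, student_logits, T=2.0):
    """Mean KL(teacher || student) at temperature T, scaled by T^2.

    Sketch of the knowledge-distillation objective used by KD-based FL
    methods on a shared toy dataset; architectures of teacher and student
    may differ, since only logits are exchanged.
    """
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    kl = np.sum(p * (np.log(p) - np.log(q)), axis=-1)
    return float(kl.mean()) * T * T
```

Because only predictions on the shared batch cross the network, each client can keep both its raw data and its architecture private, which is what enables the heterogeneous setting.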
“…Data Privacy: a detailed discussion of data privacy is given in [52]. To build on the avenues mentioned there, the use of differentially private federated neural architecture search [142] is recommended to preserve data privacy. Through this method, a model can be tested on several subsets of data, which contain varied distributions and distinctions from other datasets, while keeping any information about the individual data samples completely private.…”
Section: Related Reviews in the Field
confidence: 99%
“…To overcome model heterogeneity, KD-based methods have gained more attention by distilling knowledge from the server model to client models, which have different model structures [29], [30]. Some researchers are integrating network architecture search with FL to discover customized models for clients with different computational capabilities [31], [32]. However, most of the methods above focus on a single heterogeneity challenge in FL and do not fully consider the available computing capability of each client.…”
Section: B. Federated Learning
confidence: 99%