2020
DOI: 10.48550/arxiv.2006.10559
Preprint

Differentially-private Federated Neural Architecture Search

Abstract: Neural architecture search, which aims to automatically search for architectures (e.g., convolution, max pooling) of neural networks that maximize validation performance, has achieved remarkable progress recently. In many application scenarios, several parties would like to collaboratively search for a shared neural architecture by leveraging data from all parties. However, due to privacy concerns, no party wants its data to be seen by other parties. To address this problem, we propose federated neural archite…
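The abstract combines architecture search with privacy protection. One common way to realize this (a minimal sketch, assuming a DARTS-style setup where each client holds a continuous architecture-parameter vector and the server aggregates under the Gaussian mechanism; function names, clipping, and noise scale here are illustrative, not the paper's actual algorithm):

```python
import numpy as np

def dp_aggregate(client_alphas, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    """Average clients' architecture parameters with clipping + Gaussian noise.

    Each entry of client_alphas is one client's flattened architecture-parameter
    vector (e.g., DARTS mixing weights). Clipping bounds each client's
    contribution; the added Gaussian noise provides differential privacy.
    """
    rng = np.random.default_rng() if rng is None else rng
    clipped = []
    for alpha in client_alphas:
        norm = np.linalg.norm(alpha)
        # Scale down any vector whose L2 norm exceeds clip_norm.
        clipped.append(alpha * min(1.0, clip_norm / max(norm, 1e-12)))
    mean = np.mean(clipped, axis=0)
    # Noise standard deviation follows the Gaussian-mechanism pattern:
    # proportional to the per-client sensitivity clip_norm / n.
    sigma = noise_multiplier * clip_norm / len(client_alphas)
    return mean + rng.normal(0.0, sigma, size=mean.shape)
```

With `noise_multiplier=0` this reduces to plain clipped averaging, which makes the privacy/utility trade-off easy to probe empirically.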

Cited by 6 publications (8 citation statements) | References 16 publications
“…In terms of network topology, a variety of network topologies such as vertical FL [34,35,36,37,38,39,40], split learning [41,42], decentralized FL [43,44,45,46], hierarchical FL [47,48,49,50,51,52], and meta FL [53,54,55] have been proposed. In terms of exchanged information, aside from exchanging gradients and models, recent FL algorithms propose to exchange information such as pseudo labels in semi-supervised FL [56] and architecture parameters in neural architecture search-based FL [5,57,58]. In terms of training procedures, the training procedures in federated GAN [59,60] and transfer learning-based FL [61,62,63,64,65] are significantly different from the vanilla FedAvg algorithm [66].…”
Section: Introduction
Confidence: 99%
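The variants listed in the statement above are all contrasted with vanilla FedAvg. For reference, one round of FedAvg can be sketched as follows (a simplified version of the algorithm from the cited FedAvg paper; the local update here is plain gradient descent on a least-squares loss, purely for illustration):

```python
import numpy as np

def fedavg_round(global_w, client_data, lr=0.1, local_steps=5):
    """One FedAvg round: each client refines the global model on its own
    data, then the server averages the results weighted by data size."""
    local_models, sizes = [], []
    for X, y in client_data:
        w = global_w.copy()
        for _ in range(local_steps):
            grad = 2 * X.T @ (X @ w - y) / len(y)  # least-squares gradient
            w -= lr * grad
        local_models.append(w)
        sizes.append(len(y))
    # Weighted average: clients with more data contribute more.
    return np.average(local_models, axis=0, weights=np.asarray(sizes, float))
```

The variants in the quote change exactly the pieces this sketch makes explicit: what is exchanged (here, model weights), how clients are arranged (here, a flat star topology), and how local training proceeds.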
“…However, only a limited amount of research on the influence of non-i.i.d. distribution on federated NAS has been reported (He, Annavaram, and Avestimehr 2020; Singh et al. 2020). Adversarial training was primarily developed for i.i.d.…”
Section: (D) Optimization for Defense Mechanisms
Confidence: 99%
“…• There is an increasing demand for automated machine learning (AutoML), and many algorithms for neural architecture search (NAS) in practical scenarios have been presented [22,133]. However, only a limited amount of research on federated NAS has been reported [145,121,48,169], and little work has considered the influence of Non-IID distributions.…”
Section: Remaining Challenges and Future Directions
Confidence: 99%