2021
DOI: 10.48550/arxiv.2112.00171
Preprint

Improving Differentiable Architecture Search with a Generative Model

Abstract: In differentiable neural architecture search (NAS) algorithms like DARTS, the training set used to update model weights and the validation set used to update the model architecture are sampled from the same data distribution. Thus, the uncommon features in the dataset fail to receive enough attention during training. In this paper, instead of introducing more complex NAS algorithms, we explore the idea that adding quality synthesized datasets into training can help the classification model identify its weakness an…
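
The abstract refers to the DARTS-style bilevel setup in which model weights are updated on the training split while architecture parameters are updated on the validation split. Below is a minimal, illustrative sketch of that alternating update (first-order approximation) in PyTorch; it is not the paper's implementation, and names such as MixedOp, TinySupernet, and the random toy batches are assumptions made for the example. The paper's contribution of adding synthesized data to training is not shown here.

```python
# Minimal sketch of a DARTS-style alternating update (first-order).
# Weights are trained on the training batch; architecture logits (alpha)
# are trained on the validation batch drawn from the same distribution.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    """Softmax-weighted mixture over a few candidate operations (toy cell)."""
    def __init__(self, dim):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Identity(),
            nn.Linear(dim, dim),
            nn.Sequential(nn.Linear(dim, dim), nn.ReLU()),
        ])
        # Architecture parameters: one logit per candidate operation.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

class TinySupernet(nn.Module):
    def __init__(self, dim=16, num_classes=10):
        super().__init__()
        self.cell = MixedOp(dim)
        self.head = nn.Linear(dim, num_classes)

    def forward(self, x):
        return self.head(self.cell(x))

model = TinySupernet()
arch_params = [model.cell.alpha]
weight_params = [p for n, p in model.named_parameters() if n != "cell.alpha"]

w_opt = torch.optim.SGD(weight_params, lr=0.025, momentum=0.9)
a_opt = torch.optim.Adam(arch_params, lr=3e-4)

# Toy train/validation batches drawn from the same (random) distribution,
# mirroring the setup the abstract describes.
def make_batch(n=32, dim=16, num_classes=10):
    return torch.randn(n, dim), torch.randint(0, num_classes, (n,))

for step in range(100):
    x_train, y_train = make_batch()
    x_val, y_val = make_batch()

    # 1) Architecture step on the validation batch.
    a_opt.zero_grad()
    F.cross_entropy(model(x_val), y_val).backward()
    a_opt.step()

    # 2) Weight step on the training batch.
    w_opt.zero_grad()
    F.cross_entropy(model(x_train), y_train).backward()
    w_opt.step()
```

The full second-order DARTS update also accounts for the effect of the weight step on the architecture gradient via a finite-difference approximation; the sketch above uses the simpler first-order variant.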

Cited by 0 publications
References 37 publications