Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining (KDD 2019)
DOI: 10.1145/3292500.3330648

Auto-Keras: An Efficient Neural Architecture Search System

Abstract: Neural architecture search (NAS) has been proposed to automatically tune deep neural networks, but existing search algorithms, e.g., NASNet [41], PNAS [22], usually suffer from expensive computational cost. Network morphism, which keeps the functionality of a neural network while changing its neural architecture, could be helpful for NAS by enabling more efficient training during the search. In this paper, we propose a novel framework enabling Bayesian optimization to guide the network morphism for efficient n…
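For context, the system described in this paper ships as the open-source AutoKeras Python library. A minimal usage sketch follows (current 1.x API; the MNIST dataset, epoch count, and trial budget are illustrative choices, not taken from the paper):

```python
# Minimal AutoKeras usage sketch (assumes autokeras 1.x and TensorFlow installed).
# Dataset choice, epochs, and trial budget are illustrative, not from the paper.
import autokeras as ak
from tensorflow.keras.datasets import mnist

(x_train, y_train), (x_test, y_test) = mnist.load_data()

# max_trials bounds how many candidate architectures the search explores.
clf = ak.ImageClassifier(max_trials=3, overwrite=True)
clf.fit(x_train, y_train, epochs=2)   # runs the architecture search
print(clf.evaluate(x_test, y_test))   # accuracy of the best model found
model = clf.export_model()            # best model as a plain Keras model
```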

Cited by 649 publications (448 citation statements); references 36 publications (44 reference statements). Citing publications span 2019–2023.

Selected citation statements (ordered by relevance):
“…Several major AutoML libraries have become quite popular since the initial introduction of Auto-Weka [48] in 2013. Currently, Auto-sklearn [49], TPOT [50], H2O-AutoML [51], Microsoft's NNI, and AutoKeras [52] are the most popular ones among practitioners and further discussed in this section.…”
Section: Automatic Machine Learning (AutoML)
confidence: 99%
“…AutoKeras introduced the morphism-based search space, allowing high-performing models to be modified rather than regenerated. Like NASBOT, AutoKeras defines a kernel for NAS architectures in order to use Gaussian processes for BO [52].…”
Section: Neural Architecture Search
confidence: 99%
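To make the NASBOT/AutoKeras idea in that excerpt concrete, here is a toy sketch of Bayesian optimization over architectures: a kernel built from a hypothetical architecture edit distance feeds a Gaussian-process posterior, and an upper-confidence-bound acquisition picks the next morphed candidate. The distance function, candidate set, and hyperparameters below are illustrative assumptions, not AutoKeras internals.

```python
# Toy Bayesian optimization over architectures with a distance-based kernel.
# edit_distance() is a stand-in for the real network-level metric.
import numpy as np

def edit_distance(a, b):
    # Hypothetical: architectures as (depth, width) tuples; real systems
    # use a graph edit distance over layers and skip connections.
    return abs(a[0] - b[0]) + abs(np.log2(a[1]) - np.log2(b[1]))

def kernel(a, b, length_scale=1.0):
    # RBF-style kernel on the architecture distance, as in NASBOT/AutoKeras.
    return np.exp(-edit_distance(a, b) ** 2 / (2 * length_scale ** 2))

def gp_posterior(candidates, observed, scores, noise=1e-6):
    K = np.array([[kernel(x, y) for y in observed] for x in observed])
    K_s = np.array([[kernel(c, y) for y in observed] for c in candidates])
    K_inv = np.linalg.inv(K + noise * np.eye(len(observed)))
    mu = K_s @ K_inv @ np.array(scores)
    var = 1.0 - np.einsum('ij,jk,ik->i', K_s, K_inv, K_s)
    return mu, np.maximum(var, 0.0)

# Architectures already trained (via morphism) and their validation accuracy.
observed = [(4, 64), (6, 128)]
scores = [0.91, 0.94]
# Morphed neighbours of the current best: deeper or wider variants.
candidates = [(6, 256), (8, 128), (7, 128)]

mu, var = gp_posterior(candidates, observed, scores)
ucb = mu + 2.0 * np.sqrt(var)   # acquisition: exploit the mean, explore the variance
print("next architecture to train:", candidates[int(np.argmax(ucb))])
```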
“…In recent years, interest in neural networks has exploded as they have proven to be state-of-the-art algorithms for image classification [29], text classification [31], video classification [25], image captioning [67], visual question answering [39], and a host of other classic artificial intelligence problems. An increased interest in automated neural architecture search has followed, resulting in a variety of algorithms using Bayesian optimization [56], network morphisms [20], or reinforcement learning [4,72]. These algorithms typically define the architecture space so that it is easily searchable by classical parameter-space exploration techniques, such as gradient-based optimization [24,35].…”
Section: Related Work 3.1 Neural Architecture Search
confidence: 99%
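The network-morphism idea referenced above — changing an architecture while preserving the function it computes — can be illustrated with a function-preserving "deepen" step in Keras. This is a generic Net2DeeperNet-style sketch, not code from any of the cited systems; the layer sizes are arbitrary.

```python
# Function-preserving "deepen" morphism (Net2DeeperNet style): insert a new
# Dense layer initialized to the identity, so the network's outputs are unchanged.
import numpy as np
import tensorflow as tf

def deepen(model):
    units = model.output_shape[-1]
    new_layer = tf.keras.layers.Dense(units, activation=None)
    deeper = tf.keras.Sequential(model.layers + [new_layer])
    deeper.build(model.input_shape)
    # Identity weight matrix and zero bias: y = I @ x + 0 = x.
    new_layer.set_weights([np.eye(units), np.zeros(units)])
    return deeper

base = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation='relu', input_shape=(8,)),
    tf.keras.layers.Dense(4),
])
x = np.random.rand(2, 8).astype('float32')
deeper = deepen(base)
# Outputs match, so training can continue from the inherited weights
# instead of restarting from scratch for each candidate architecture.
print(np.allclose(base(x).numpy(), deeper(x).numpy()))
```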
“…The Auptimizer executes jobs using the modified client.py with these configurations of client networks, and reports back to the original controller once all the generated child nets for the episode have finished running. The Proposer then computes gradients from the string of child architectures and the reported accuracies. … AutoKeras [14] is an open-source library for automated machine learning and has recently become increasingly popular for NAS applications and research. The library includes a framework and different functions to search the architecture space and hyperparameters for deep learning models.…”
Section: Neural Architecture Search
confidence: 99%
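The controller/child-network loop described in that excerpt follows the reinforcement-learning NAS recipe: a policy samples child architectures, their validation accuracies serve as rewards, and a policy gradient updates the controller. A minimal numpy sketch under assumed toy choices (the option list, reward function, and learning rate are all illustrative):

```python
# REINFORCE-style controller sketch for NAS: sample child architectures,
# treat validation accuracy as reward, update a softmax policy by policy gradient.
import numpy as np

rng = np.random.default_rng(0)
OPS = ['conv3x3', 'conv5x5', 'maxpool']   # toy per-layer choices
N_LAYERS, LR, EPISODES = 3, 0.1, 50
logits = np.zeros((N_LAYERS, len(OPS)))   # controller parameters

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def child_accuracy(arch):
    # Stand-in for training a child net and measuring validation accuracy;
    # this toy reward simply prefers conv3x3 at every layer.
    return 0.6 + 0.1 * sum(op == 'conv3x3' for op in arch) + 0.01 * rng.standard_normal()

baseline = 0.0
for episode in range(EPISODES):
    probs = [softmax(logits[i]) for i in range(N_LAYERS)]
    choices = [rng.choice(len(OPS), p=p) for p in probs]
    reward = child_accuracy([OPS[c] for c in choices])
    baseline = 0.9 * baseline + 0.1 * reward   # moving-average baseline
    for i, c in enumerate(choices):
        grad = -probs[i].copy()
        grad[c] += 1.0                          # d log pi / d logits
        logits[i] += LR * (reward - baseline) * grad

print([OPS[int(np.argmax(logits[i]))] for i in range(N_LAYERS)])
```

With the moving-average baseline reducing gradient variance, the controller's logits drift toward the high-reward choice at every layer, which is the same feedback loop the excerpt describes between the Proposer and the reported child-net accuracies.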