Proceedings of the Genetic and Evolutionary Computation Conference 2019
DOI: 10.1145/3321707.3321721
Evolutionary neural AutoML for deep learning

Abstract: Deep neural networks (DNNs) have produced state-of-the-art results in many benchmarks and problem domains. However, the success of DNNs depends on the proper configuration of their architecture and hyperparameters. Such configuration is difficult, and as a result, DNNs are often not used to their full potential. In addition, DNNs in commercial applications often need to satisfy real-world design constraints such as size or number of parameters. To make configuration easier, automatic machine learning (AutoML) s…

Cited by 98 publications (40 citation statements); references 33 publications.
“…The performance of the BP and NEAT algorithms drops sharply when the number of iterations is the same as that of GA-CANS. In addition, one AutoML system [30], auto-sklearn, is compared with GA-CANS; it takes a long time to obtain the optimal model. When the training time of auto-sklearn is reduced to match that of GA-CANS, an appropriate network model cannot be obtained in so short a time, so accuracy drops rapidly.…”
Section: Improvement Of Roulette Methods On Invalid Connection Deletion (mentioning)
confidence: 99%
“…Some studies explore a large number of datasets [4], [5]. Our comparison adopts 12 datasets, fewer than the two works mentioned above but more than the number used in eleven other studies (e.g., [6], [7]). More importantly, we consider eight AutoML technologies, a number exceeded only by [8] (which tested only one dataset) and [9] (which did not use any datasets).…”
Section: Related Work (mentioning)
confidence: 99%
“…Therefore, the networks are often trained for a fixed (low) number of epochs (e.g., 8-10 epochs). To overcome the computational burden of evolution, we can use clusters of Graphics Processing Units (GPUs) (e.g., Amazon AWS or Google Cloud) [22], evaluate the candidate solutions on a limited number of data instances [7], or train for a fixed number of epochs (or amount of time) and let evolution resume the training in a subsequent generation by loading the previous weights [23]. In the current work we use a variant of Deep Evolutionary Network Structured Representation (DENSER) [20] to search for Convolutional Neural Networks (CNNs) to distinguish between gamma radiations and protons.…”
Section: Neuroevolution (mentioning)
confidence: 99%
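The weight-resuming strategy described in the quote above can be sketched in a minimal, self-contained way. This is a hypothetical toy example, not the DENSER implementation: "training" is a simple update nudging toy weights toward a fixed target, and each individual stores its weights so that the next generation resumes training where the previous one stopped, instead of restarting from scratch.

```python
# Toy sketch (assumption, not the authors' code): evolution where each
# candidate trains for a fixed, low per-generation budget and resumes
# training in later generations by reloading its stored weights.
import random

random.seed(0)

def train(weights, epochs, lr=0.1):
    # Stand-in for real training: nudge each weight toward a target of 1.0.
    for _ in range(epochs):
        weights = [w + lr * (1.0 - w) for w in weights]
    return weights

def fitness(weights):
    # Higher is better: negative squared distance to the target.
    return -sum((1.0 - w) ** 2 for w in weights)

# Each individual keeps its weights between generations so training resumes.
population = [{"weights": [random.uniform(-1, 1) for _ in range(4)]}
              for _ in range(6)]

EPOCHS_PER_GEN = 3  # fixed, low training budget per generation
for generation in range(5):
    for ind in population:
        ind["weights"] = train(ind["weights"], EPOCHS_PER_GEN)  # resume
        ind["fitness"] = fitness(ind["weights"])
    population.sort(key=lambda ind: ind["fitness"], reverse=True)
    # Truncation selection: keep the top half and clone it to refill.
    survivors = population[:3]
    population = survivors + [{"weights": list(s["weights"])} for s in survivors]

best = population[0]
print(best["fitness"])
```

Because weights persist across generations, surviving candidates accumulate training time (here 5 × 3 updates) while each generation still pays only the fixed per-generation cost, which is the point of the scheme described in [23].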