2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr.2019.00492
RENAS: Reinforced Evolutionary Neural Architecture Search

Abstract: Neural Architecture Search (NAS) is an important yet challenging task in network design due to its high computational consumption. To address this issue, we propose the Reinforced Evolutionary Neural Architecture Search (RENAS), an evolutionary method with reinforced mutation for NAS. Our method integrates reinforced mutation into an evolution algorithm for neural architecture exploration, in which a mutation controller is introduced to learn the effects of slight modifications and make mutation acti…
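To make the abstract's idea concrete, here is a minimal sketch of evolutionary search with a "reinforced" mutation policy: a controller learns, via a REINFORCE-style update, which part of an architecture is worth mutating, using the child's fitness gain as reward. This is an assumption-laden illustration (toy encoding, synthetic fitness, softmax-over-positions controller), not the paper's actual implementation.

```python
import math
import random

# An architecture is a fixed-length list of operation indices; fitness()
# is a synthetic stand-in for validation accuracy (hypothetical choice).
OPS = ["conv3x3", "conv5x5", "maxpool", "identity"]
ARCH_LEN = 6

def fitness(arch):
    # Toy proxy: reward architectures that use op index 0 ("conv3x3") often.
    return sum(1.0 for op in arch if op == 0) / ARCH_LEN

class MutationController:
    """Softmax policy over which position to mutate, trained by REINFORCE."""
    def __init__(self, n):
        self.logits = [0.0] * n
        self.lr = 0.5

    def probs(self):
        m = max(self.logits)
        exps = [math.exp(l - m) for l in self.logits]
        z = sum(exps)
        return [e / z for e in exps]

    def sample(self):
        return random.choices(range(len(self.logits)), weights=self.probs())[0]

    def update(self, pos, reward):
        # Policy-gradient step: scale the log-prob gradient by the reward.
        p = self.probs()
        for i in range(len(self.logits)):
            grad = (1.0 if i == pos else 0.0) - p[i]
            self.logits[i] += self.lr * reward * grad

random.seed(0)
ctrl = MutationController(ARCH_LEN)
population = [[random.randrange(len(OPS)) for _ in range(ARCH_LEN)]
              for _ in range(8)]

for step in range(200):
    parent = max(random.sample(population, 3), key=fitness)  # tournament select
    pos = ctrl.sample()                        # controller picks where to mutate
    child = parent[:]
    child[pos] = random.randrange(len(OPS))    # random new op at that slot
    reward = fitness(child) - fitness(parent)  # did the mutation help?
    ctrl.update(pos, reward)
    population.remove(min(population, key=fitness))  # replace the worst
    population.append(child)

best = max(population, key=fitness)
```

The key design point mirrored from the abstract: mutation is no longer uniformly random — the controller accumulates credit for positions whose mutations improved fitness, so exploration concentrates where slight modifications pay off.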

Cited by 117 publications (73 citation statements). References 24 publications.
“…All the deep-learning-related frameworks are implemented with the PyTorch package 5 . Every single search experiment is run on a single GPU (NVIDIA GeForce RTX 2080 Ti), with three architectures trained in parallel on it.…”
Section: C3 Software and Hardware Descriptions
confidence: 99%
“…Furthermore, in this model the author benefited from a new regularization method called ScheduledDropPath. Moreover, by combining a Controller Recurrent Neural Network (CRNN), a CNN, and a reinforced evolutionary algorithm, this model is able to choose the best cell candidates to form the blocks and ultimately build the best architecture for the given dataset [72]-[73]. In this model, the controller RNN generates a sample architecture with a sample probability, using a set of operations.…”
Section: Neural Architecture Search Network (NASNet)
confidence: 99%
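The sampling step this citation describes can be sketched as autoregressive draws from per-step probability distributions over a set of operations. The sketch below is purely illustrative (the `controller_step` function is a hypothetical stand-in for an RNN forward pass, and the operation set is assumed), not NASNet's actual controller.

```python
import random

# Candidate operations the controller chooses among (assumed set).
OPS = ["conv3x3", "conv5x5", "maxpool3x3", "identity"]

def controller_step(prev_op_index, step):
    # Stand-in for one RNN step: any function of the history that returns
    # a distribution over OPS. Here: near-uniform, with a toy "recurrence"
    # that slightly biases toward repeating the previous operation.
    weights = [1.0] * len(OPS)
    weights[prev_op_index] += 0.5
    total = sum(weights)
    return [w / total for w in weights]

def sample_architecture(n_steps=5, seed=0):
    """Sample one architecture, one operation at a time, by probability."""
    rng = random.Random(seed)
    arch, prev = [], 0
    for step in range(n_steps):
        probs = controller_step(prev, step)
        prev = rng.choices(range(len(OPS)), weights=probs)[0]
        arch.append(OPS[prev])
    return arch
```

Because each operation is drawn from a distribution conditioned on what was emitted so far, training the controller (e.g. with policy gradients on validation accuracy) shifts those distributions toward operation sequences that form better cells.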
“…Early genetic-algorithm-based methods still require independent training of each network architecture from scratch, so these methods incur high computational costs; for example, the AmoebaNet [14] search took 3150 GPU days. However, some recent methods have achieved state-of-the-art performance with fewer iterations and less computation time [27,28].…”
Section: Genetic Algorithm
confidence: 99%