2021
DOI: 10.1109/tai.2021.3067574

Neuroevolution in Deep Neural Networks: Current Trends and Future Challenges

Abstract: A variety of methods have been applied to the architectural configuration and training of artificial deep neural networks (DNNs). These methods play a crucial role in the success or failure of a DNN for most problems and applications. Evolutionary algorithms (EAs) are gaining momentum as a computationally feasible method for the automated optimization of DNNs. Neuroevolution is the term that describes these processes of automated configuration and training of DNNs using EAs. While many works exist…

Cited by 94 publications (44 citation statements)
References 131 publications (225 reference statements)
“…In that sense, the architecture of the network (i.e., the number of layers and the number of hidden nodes) can also be considered part of the search space, since it determines the size of the network. Neuroevolution techniques such as NEAT evolve both the weights and the NN architecture through crossover and mutation [20]. Thus, compared to traditional optimizers (such as stochastic gradient descent, SGD), which optimize only the weights, the search space for neuroevolution techniques is considerably larger [1].…”
Section: Discussion of Practical Implementation
confidence: 99%
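To make the enlarged search space concrete, here is a minimal sketch of NEAT-style mutation, in which both the connection weights and the topology itself are mutable. The genome encoding and helper names (Genome, mutate_add_node, etc.) are illustrative assumptions, not NEAT's reference implementation.

```python
import random
from dataclasses import dataclass, field

@dataclass
class Connection:
    src: int
    dst: int
    weight: float
    enabled: bool = True

@dataclass
class Genome:
    num_nodes: int
    connections: list = field(default_factory=list)

def mutate_weights(genome, sigma=0.5, rate=0.8):
    """Parametric mutation: perturb each weight with probability `rate`."""
    for conn in genome.connections:
        if random.random() < rate:
            conn.weight += random.gauss(0.0, sigma)

def mutate_add_connection(genome):
    """Structural mutation: connect two previously unconnected nodes."""
    src = random.randrange(genome.num_nodes)
    dst = random.randrange(genome.num_nodes)
    if src != dst and not any(c.src == src and c.dst == dst
                              for c in genome.connections):
        genome.connections.append(Connection(src, dst, random.gauss(0.0, 1.0)))

def mutate_add_node(genome):
    """Structural mutation: split an existing connection with a new node."""
    if not genome.connections:
        return
    conn = random.choice(genome.connections)
    conn.enabled = False
    new_node = genome.num_nodes
    genome.num_nodes += 1
    # NEAT convention: incoming weight 1.0, outgoing weight inherited,
    # so the new node initially preserves the old connection's behavior.
    genome.connections.append(Connection(conn.src, new_node, 1.0))
    genome.connections.append(Connection(new_node, conn.dst, conn.weight))
```

Because structural mutations keep adding nodes and connections, the optimizer searches over architectures as well as weights, which is exactly why the search space is larger than SGD's fixed-architecture weight space.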
“…This is accomplished by assigning a fitness value to each genome. Afterward, the fitness values are taken into account by a selection operator, e.g., tournament selection [20] (see Figure 3(c)). Crossover (line #6 in Algorithm 1): the steps corresponding to fitness evaluation and selection are then followed by the crossover operation, also known as recombination, where the genetic information (e.g., the parameters of the NNs) of two selected individuals is combined, as shown in Figure 3(d).…”
Section: NeuroEvolution of Augmenting Topologies (NEAT)
confidence: 99%
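The selection and crossover steps described in this statement can be sketched in a few lines. The flat parameter-vector genome, the toy fitness function, and the uniform recombination scheme below are assumptions for illustration; the cited algorithm may use a different encoding and operators.

```python
import random

def tournament_select(population, fitness, k=3):
    """Selection: pick the fittest of k randomly sampled individuals."""
    contenders = random.sample(population, k)
    return max(contenders, key=fitness)

def uniform_crossover(parent_a, parent_b):
    """Recombination: combine two parameter vectors gene by gene."""
    return [a if random.random() < 0.5 else b
            for a, b in zip(parent_a, parent_b)]

# Usage: produce one generation of offspring from a random population.
population = [[random.gauss(0, 1) for _ in range(10)] for _ in range(20)]
fitness = lambda genome: -sum(g * g for g in genome)  # toy objective
offspring = [
    uniform_crossover(tournament_select(population, fitness),
                      tournament_select(population, fitness))
    for _ in range(len(population))
]
```

Tournament selection is popular because it needs only pairwise fitness comparisons and its selection pressure is tunable through the tournament size k.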
“…Figure 4 shows the number of channels in the compressed and pre-trained VGG network on CIFAR-10. We found that for the deeper layers (11-15), the number of channels in the compressed network is <1% of that in the pre-trained network. This shows that the network does not learn much in the deeper layers, as the input resolution (2×2) for these layers is very small and does not provide enough information to learn unique features.…”
Section: Analysis of Compressed Network
confidence: 95%
“…Some approaches to automated architecture design have relied on sparsifying regularizers, often applying L1 regularization to obtain a sparse subnetwork that can be extracted from the original large network [6,8,9]. Extensions of this work have also taken into account resource constraints such as latency and computation [5,10,11]. Other approaches build networks from the ground up using a set of custom building blocks, relying on variations of trial-and-error search to find promising architectures [12-15].…”
Section: Introduction
confidence: 99%
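As a rough sketch of the L1-regularization route to a sparse subnetwork, the PyTorch loop below adds an L1 penalty to the task loss and then prunes near-zero weights. The model, dummy data, penalty strength, and pruning threshold are all placeholder assumptions, not values from the cited works.

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 10))
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
criterion = nn.CrossEntropyLoss()
l1_lambda = 1e-4  # assumed penalty strength; tuned per task in practice

x = torch.randn(32, 64)           # dummy batch
y = torch.randint(0, 10, (32,))   # dummy labels

for step in range(100):
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    # The L1 penalty drives many weights toward exactly zero.
    loss = loss + l1_lambda * sum(p.abs().sum() for p in model.parameters())
    loss.backward()
    optimizer.step()

# "Extract" the sparse subnetwork by zeroing near-zero weights.
with torch.no_grad():
    for p in model.parameters():
        p.mul_((p.abs() > 1e-3).float())
```

Unlike L2 regularization, whose gradient shrinks proportionally with the weight, the constant-magnitude L1 gradient pushes small weights all the way to zero, which is what makes the surviving weights a well-defined subnetwork.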
“…Deep neuroevolution (DNE) is a promising sub-field of evolution strategies/genetic algorithms that improves the performance of convolutional neural networks (CNNs) [25]. CNNs, along with transformers, currently dominate deep learning applications in radiology.…”
Section: RANO and RECIST Are the Prevalent Formal Methods to Assess T...
confidence: 99%