1993
DOI: 10.1002/int.4550080406

A review of evolutionary artificial neural networks

Abstract: Research on potential interactions between connectionist learning systems, i.e., artificial neural networks (ANNs), and evolutionary search procedures, such as genetic algorithms (GAs), has attracted a lot of attention recently. Evolutionary ANNs (EANNs) can be considered as the combination of ANNs and evolutionary search procedures. This article first distinguishes among three kinds of evolution in EANNs, i.e., the evolution of connection weights, of architectures, and of learning rules. Then it reviews each kind…
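Of these three, the evolution of connection weights is the simplest to illustrate: gradient-based training is replaced by a population search over the weight vector. Below is a minimal Python sketch of that idea; the 2-4-1 network, XOR fitness task, truncation selection, and Gaussian mutation are illustrative assumptions, not Yao's specific formulation.

import numpy as np

rng = np.random.default_rng(0)

# Toy fitness task: XOR. Fitness = negative mean squared error.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

N_HIDDEN = 4
N_WEIGHTS = 2 * N_HIDDEN + N_HIDDEN + N_HIDDEN + 1  # W1, b1, w2, b2

def forward(w, x):
    """Decode a flat chromosome into a 2-4-1 network and run it."""
    i = 0
    W1 = w[i:i + 2 * N_HIDDEN].reshape(2, N_HIDDEN); i += 2 * N_HIDDEN
    b1 = w[i:i + N_HIDDEN]; i += N_HIDDEN
    w2 = w[i:i + N_HIDDEN]; i += N_HIDDEN
    b2 = w[i]
    h = np.tanh(x @ W1 + b1)                        # hidden layer
    return 1.0 / (1.0 + np.exp(-(h @ w2 + b2)))     # sigmoid output

def fitness(w):
    return -np.mean((forward(w, X) - y) ** 2)

POP, GENS, SIGMA = 50, 300, 0.3
pop = rng.normal(0.0, 1.0, size=(POP, N_WEIGHTS))

for _ in range(GENS):
    scores = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(scores)[::-1][:POP // 2]]   # truncation selection
    picks = rng.integers(0, len(parents), POP - len(parents))
    children = parents[picks] + rng.normal(0.0, SIGMA, (len(picks), N_WEIGHTS))
    pop = np.vstack([parents, children])                 # elitist replacement

best = max(pop, key=fitness)
print("evolved XOR outputs:", np.round(forward(best, X), 2))

The chromosome here is simply the flattened weight vector; selection needs only a scalar fitness, which is why weight evolution is attractive when gradients are unavailable or the error surface is deceptive.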

Citations: cited by 399 publications (152 citation statements)
References: 33 publications
“…1. Many RNN evolvers have been proposed (e.g., Miller et al., 1989; Wieland, 1991; Cliff et al., 1993; Yao, 1993; Nolfi et al., 1994a; Sims, 1994; Yamauchi and Beer, 1994; Miglino et al., 1995; Moriarty, 1997; Pasemann et al., 1999; Juang, 2004; Whiteson, 2012). One particularly effective family of methods coevolves neurons, combining them into networks, and selecting those neurons for reproduction that participated in the best-performing networks (Moriarty and Miikkulainen, 1996; Gomez and Miikkulainen, 2003).…”
Section: Deep Hierarchical RL (HRL) and Subgoal Learning With FNNs An… (citation type: mentioning)
confidence: 99%
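The neuron-level coevolution mentioned in this excerpt can be sketched compactly: maintain a population of hidden neurons rather than whole networks, assemble random subsets into networks, and credit each participating neuron with the resulting network score. The sketch below follows that recipe in the spirit of SANE/ESP; the XOR stand-in task, network size, and averaging-based credit scheme are simplifying assumptions, not the cited implementations.

import numpy as np

rng = np.random.default_rng(1)

# Stand-in fitness task (XOR); the cited work uses RL control tasks.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

N_NEURONS, NET_SIZE, TRIALS = 40, 4, 200
GENE = 2 + 1 + 1   # per neuron: 2 input weights, bias, output weight

neurons = rng.normal(0.0, 1.0, size=(N_NEURONS, GENE))

def evaluate(net):
    """net rows: [w_in1, w_in2, bias, w_out]. Negative MSE on XOR."""
    h = np.tanh(X @ net[:, :2].T + net[:, 2])       # hidden activations
    out = 1.0 / (1.0 + np.exp(-(h @ net[:, 3])))    # sigmoid output unit
    return -np.mean((out - y) ** 2)

for _ in range(50):
    fit_sum = np.zeros(N_NEURONS)
    fit_cnt = np.zeros(N_NEURONS)
    for _ in range(TRIALS):
        idx = rng.choice(N_NEURONS, NET_SIZE, replace=False)  # assemble a network
        score = evaluate(neurons[idx])
        fit_sum[idx] += score        # credit every participating neuron
        fit_cnt[idx] += 1
    avg = fit_sum / np.maximum(fit_cnt, 1)
    keep = np.argsort(avg)[::-1][: N_NEURONS // 2]   # best neurons reproduce
    parents = neurons[keep]
    picks = rng.integers(0, len(parents), N_NEURONS - len(parents))
    neurons = np.vstack([parents,
                         parents[picks] + rng.normal(0.0, 0.2, (len(picks), GENE))])

The design point is credit assignment: a neuron is rewarded for cooperating well with many different partners, which pushes the population toward complementary specializations rather than many copies of one good unit.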
“…3. Yao [132,255] summarized all such forms of adaptation in evolutionary artificial neural networks (EANNs), a special class of artificial neural network in which, in addition to learning, evolution is another fundamental form of adaptation. In fact, a paradigm called neuroevolution accommodates adaptive learning of all or some components of an FNN in intuitive ways by applying EAs.…”
Section: Learning Algorithm Optimization (citation type: mentioning)
confidence: 99%
“…This is often unrealistic. A more general approach for partially observable environments directly evolves programs for RNNs with internal states (no need for the Markovian assumption), by applying evolutionary algorithms [30,50,83] to RNN weight matrices [26,85,89,105]. Recent work brought progress through a focus on reducing search spaces by co-evolving the comparatively small weight vectors of individual neurons and synapses [21], by natural-gradient-based stochastic search strategies [19,53,92,93,102,103], and by reducing search spaces through weight matrix compression [36,61].…”
Section: Recurrent / Deep Neural Network (citation type: mentioning)
confidence: 99%
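To make the idea of evolving RNN weight matrices concrete, here is a minimal (μ, λ) evolution-strategy sketch on a delayed-copy task, where the target at each step is the previous input bit, so the hidden state must carry information across time, exactly the non-Markovian setting the excerpt describes. The task, architecture, and hyperparameters are assumptions for illustration; the cited works use more sophisticated machinery (natural-gradient search distributions, neuron/synapse-level coevolution, compressed weight encodings).

import numpy as np

rng = np.random.default_rng(2)

# Non-Markovian toy task: at each step, output the PREVIOUS input bit.
# A memoryless mapping cannot solve this; the hidden state must carry it.
BITS = rng.integers(0, 2, 40).astype(float)   # fixed evaluation sequence

H = 4                                    # hidden units
N_W = H + H * H + H + H + 1              # w_in, W_rec, b_h, w_out, b_out

def unpack(w):
    i = 0
    w_in = w[i:i + H]; i += H
    W_rec = w[i:i + H * H].reshape(H, H); i += H * H
    b_h = w[i:i + H]; i += H
    w_out = w[i:i + H]; i += H
    return w_in, W_rec, b_h, w_out, w[i]

def fitness(w):
    w_in, W_rec, b_h, w_out, b_out = unpack(w)
    h = np.zeros(H)
    err = 0.0
    for t in range(len(BITS)):
        pred = 1.0 / (1.0 + np.exp(-(h @ w_out + b_out)))  # read out memory
        if t > 0:
            err += (pred - BITS[t - 1]) ** 2
        h = np.tanh(BITS[t] * w_in + h @ W_rec + b_h)      # absorb current bit
    return -err / (len(BITS) - 1)

mean, SIGMA, LAM = rng.normal(0.0, 0.5, N_W), 0.1, 40
for _ in range(300):
    cand = mean + SIGMA * rng.normal(0.0, 1.0, (LAM, N_W))  # sample offspring
    scores = np.array([fitness(c) for c in cand])
    elite = cand[np.argsort(scores)[::-1][:LAM // 4]]
    mean = elite.mean(axis=0)                               # recombine the elite

print("final delayed-copy fitness (0 is perfect):", round(fitness(mean), 4))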