2017
DOI: 10.48550/arxiv.1703.00548
Preprint

Evolving Deep Neural Networks

Cited by 54 publications (85 citation statements). References 0 publications.
“…In this case, the genotype represents an abstraction for the implementation of a neural network. This representation can be direct, i.e., all nodes and connections of the neural architecture are encoded [22,28], or indirect, i.e., rules are specified to derive the concrete implementation of neural networks, such as in structured grammatical evolution [3,19].…”
Section: Neuroevolution
confidence: 99%
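The direct/indirect distinction quoted above can be made concrete with a small sketch. The classes below are purely illustrative (none of these names come from the cited works): a direct genome stores every node and connection explicitly, while an indirect, grammar-style genome stores expansion rules from which a concrete layer sequence is derived, roughly in the spirit of structured grammatical evolution.

```python
# Illustrative sketch of the two genotype styles described above.
# All class and method names are hypothetical, not APIs from the cited papers.
from dataclasses import dataclass, field

@dataclass
class DirectGenome:
    """Direct encoding: every node and connection is stored explicitly."""
    nodes: list[int] = field(default_factory=list)  # node ids
    connections: list[tuple[int, int, float]] = field(default_factory=list)  # (src, dst, weight)

@dataclass
class IndirectGenome:
    """Indirect encoding: grammar-like rules are expanded to derive the network."""
    rules: dict[str, list[str]] = field(default_factory=dict)  # symbol -> expansion

    def derive(self, symbol: str = "net", depth: int = 0, max_depth: int = 5) -> list[str]:
        """Recursively expand rules into a flat list of terminal layer tokens."""
        if depth >= max_depth or symbol not in self.rules:
            return [symbol]  # terminal: a concrete layer token
        out: list[str] = []
        for part in self.rules[symbol]:
            out.extend(self.derive(part, depth + 1, max_depth))
        return out

# A tiny grammar that derives the layer sequence: conv relu conv relu dense
g = IndirectGenome(rules={"net": ["block", "block", "dense"],
                          "block": ["conv", "relu"]})
print(g.derive())  # ['conv', 'relu', 'conv', 'relu', 'dense']
```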
“…The genotype is a direct representation of the neural network, where NEAT defines two lists for the genome of individuals: a list of neurons and a list of connections between these neurons. A further expansion of NEAT was proposed to enable larger search spaces in DeepNEAT and CoDeepNEAT [22]. In these models, the genes composing a genome are abstractions of entire layers, enabling the representation of deep neural networks.…”
Section: Neuroevolution
confidence: 99%
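As a concrete illustration of the genome layouts this excerpt describes, here is a toy sketch of NEAT's two gene lists and a DeepNEAT-style genome whose genes abstract entire layers. The class names are hypothetical and do not mirror any NEAT library's API; they only show why a short layer-level genome can describe a deep network.

```python
# Toy illustration of the genome layouts described above; names are
# hypothetical and do not mirror any NEAT implementation's API.
from dataclasses import dataclass

@dataclass
class NeuronGene:
    id: int
    kind: str                 # "input" | "hidden" | "output"

@dataclass
class ConnectionGene:
    src: int                  # source neuron id
    dst: int                  # destination neuron id
    weight: float
    enabled: bool
    innovation: int           # historical marking used by NEAT crossover

@dataclass
class NEATGenome:
    """NEAT's direct representation: two lists, neurons and connections."""
    neurons: list[NeuronGene]
    connections: list[ConnectionGene]

@dataclass
class LayerGene:
    """DeepNEAT/CoDeepNEAT-style gene: abstracts a whole layer plus its
    hyperparameters, so a short genome can represent a deep network."""
    layer_type: str           # e.g. "conv2d", "dense", "dropout"
    params: dict              # e.g. {"filters": 64, "kernel": 3}

deep_genome = [
    LayerGene("conv2d", {"filters": 32, "kernel": 3}),
    LayerGene("conv2d", {"filters": 64, "kernel": 3}),
    LayerGene("dense",  {"units": 10}),
]
```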
“…Importantly, recognising the advantages of evolution as a global optimiser, there has been a paradigm shift towards utilising NE as an optimiser for the network structure in combination with backpropagation (BP) to fine-tune the network weights. For instance, deep convolutional NNs (CNNs) with multiple layers and millions of parameters have been evolved for tasks ranging from image classification [16], [17], image captioning [17] (using an evolved deep Long Short-Term Memory (LSTM) network) and even applications in particle physics (neutron scattering model selection) [18]. A differentiable version of CPPN was proposed in [19] to efficiently compress the representation of deep CNNs.…”
Section: I
confidence: 99%
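The paradigm shift this excerpt describes, evolution as a global optimiser for structure combined with backpropagation for the weights, amounts to an outer evolutionary loop wrapped around an inner gradient-based training step. The sketch below is a minimal schematic under that assumption: `random_architecture`, `mutate`, and `train_with_backprop` are hypothetical stand-ins (the last one fakes a fitness score), whereas a real version would train each candidate with a framework such as PyTorch or Keras and return its validation accuracy.

```python
# Schematic hybrid loop: evolution searches over architectures, while a
# (stubbed) backpropagation step scores each candidate. Illustrative only.
import random

def random_architecture() -> list[int]:
    """Genotype: hidden-layer widths, a deliberately tiny search space."""
    return [random.choice([16, 32, 64]) for _ in range(random.randint(1, 3))]

def mutate(arch: list[int]) -> list[int]:
    """Point mutation: resample the width of one layer."""
    child = list(arch)
    i = random.randrange(len(child))
    child[i] = random.choice([16, 32, 64])
    return child

def train_with_backprop(arch: list[int]) -> float:
    """Stub for BP fine-tuning; a real version would train the network and
    return validation accuracy. Here the fitness is faked for illustration."""
    return sum(arch) / 300.0 + random.uniform(0.0, 0.05)

population = [random_architecture() for _ in range(8)]
for generation in range(5):
    scored = sorted(population, key=train_with_backprop, reverse=True)
    parents = scored[:4]                                   # truncation selection
    population = parents + [mutate(random.choice(parents)) for _ in range(4)]

print("best architecture:", max(population, key=train_with_backprop))
```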