2018 | DOI: 10.1007/s10710-018-9339-y
DENSER: deep evolutionary network structured representation

Abstract: Deep Evolutionary Network Structured Representation (DENSER) is a novel approach to automatically design Artificial Neural Networks (ANNs) using Evolutionary Computation. The algorithm not only searches for the best network topology (e.g., number of layers, type of layers), but also tunes hyper-parameters, such as learning parameters or data augmentation parameters. The automatic design is achieved using a representation with two distinct levels, where the outer level encodes the general structure of the netw…
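To make the two-level representation concrete, below is a minimal Python sketch of such a genotype: an outer level holding the ordered sequence of layers plus learning parameters, and an inner level holding each layer's own evolvable parameters. The class names, layer types, and value ranges are illustrative assumptions, not the authors' actual DENSER implementation.

```python
# Illustrative sketch of a two-level genotype in the spirit of DENSER.
# All names and value ranges are placeholders, not the authors' code.
import random
from dataclasses import dataclass, field

@dataclass
class LayerGene:
    """Inner level: one layer and its evolvable (hyper-)parameters."""
    layer_type: str                     # e.g. "conv", "pool", "fc"
    params: dict = field(default_factory=dict)

@dataclass
class NetworkGenotype:
    """Outer level: ordered sequence of layers plus learning parameters."""
    layers: list                        # list of LayerGene
    learning: dict = field(default_factory=dict)   # e.g. learning rate

def random_layer() -> LayerGene:
    """Sample one layer; the choices below are arbitrary placeholders."""
    layer_type = random.choice(["conv", "pool", "fc"])
    if layer_type == "conv":
        params = {"filters": random.choice([16, 32, 64]),
                  "kernel": random.choice([3, 5])}
    elif layer_type == "pool":
        params = {"kind": random.choice(["max", "avg"])}
    else:
        params = {"units": random.choice([64, 128, 256])}
    return LayerGene(layer_type, params)

def random_genotype(min_layers=2, max_layers=6) -> NetworkGenotype:
    """Build a random individual: topology (outer) + per-layer params (inner)."""
    n = random.randint(min_layers, max_layers)
    return NetworkGenotype(
        layers=[random_layer() for _ in range(n)],
        learning={"lr": 10 ** random.uniform(-4, -1)},
    )
```

In an evolutionary loop, a genotype like this would be mapped to a trainable network, trained briefly, and assigned a fitness (e.g., validation accuracy) before variation operators act on either level.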

Citations: Cited by 99 publications (56 citation statements)
References: 56 publications
“…From these results we hypothesise that superior generalisation performances are obtained by incremental development, when passing from more simple to more challenging datasets. That is the reason why there is no statistical difference in the CIFAR-10 vs. MNIST → SVHN → CIFAR-10 setups: the CIFAR-10 is per se more challenging to solve than the remaining ones, and therefore, as already noticed in a previous article [13], the DANNs generated for addressing CIFAR-10 tend to be able to solve other easier problems. The remarkable aspect of incremental development is when a DANN optimised for Fashion is able to get better results on the CIFAR-10, compared to when the DANNs for Fashion are not evolved in an incremental fashion.…”
Section: Experimental Results: Generalisation of the Models (mentioning)
confidence: 70%
“…NeuroEvolution (NE) approaches are usually grouped according to the target of evolution, i.e., topology [3,4], learning (i.e., weights, parameters, or learning policies) [5,6,7], or the simultaneous evolution of the topology and learning [8,9]. Nonetheless, more recent efforts have been put towards the proposal of methods that deal with the optimisation of DANNs, and thus we feel that it is more intuitive to divide them into small-scale [5,8] and large-scale [7,10,11,12,13] NE. The current paper focuses on the latter; a complete survey can be found in [14].…”
Section: Related Work (mentioning)
confidence: 99%
“…In order to design Artificial Neural Networks (ANNs) automatically with evolutionary computation, a Deep Evolutionary Network Structured Representation (DENSER) was proposed in [121], where the optimal design for the network is achieved by a bi-level representation. The outer level deals with the number of layers and their sequence, whereas the inner level optimizes the parameters and hyper-parameters associated with each layer, defined by a human-perceivable context-free grammar.…”
Section: Swarm Intelligence in Deep Learning (mentioning)
confidence: 99%
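To illustrate what a grammar-driven inner level can look like, here is a toy Python sketch of a context-free grammar and a random derivation routine. The grammar symbols, productions, and parameter values are invented for illustration and are not taken from the DENSER paper.

```python
# Toy context-free grammar for a layer's parameters, plus a random expansion.
# Symbols and productions are illustrative only.
import random

GRAMMAR = {
    "<conv>":       [["layer:conv", "<filters>", "<kernel>", "<activation>"]],
    "<filters>":    [["num-filters:32"], ["num-filters:64"], ["num-filters:128"]],
    "<kernel>":     [["kernel-size:3"], ["kernel-size:5"]],
    "<activation>": [["act:relu"], ["act:sigmoid"]],
}

def expand(symbol: str) -> list:
    """Recursively expand a nonterminal into a flat list of terminal tokens."""
    if symbol not in GRAMMAR:           # terminal token, emit as-is
        return [symbol]
    production = random.choice(GRAMMAR[symbol])
    tokens = []
    for sym in production:
        tokens.extend(expand(sym))
    return tokens

# Example: one derivation of a convolutional layer's parameters.
print(expand("<conv>"))
# e.g. ['layer:conv', 'num-filters:64', 'kernel-size:3', 'act:relu']
```

In grammar-based NE approaches of this kind, evolution typically acts on which production is chosen at each expansion step, so mutating a genotype amounts to re-deriving part of the grammar.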
“…Deep Evolutionary Network Structured Representation (DENSER) [20] is a general-purpose grammar-based NeuroEvolution (NE) approach. It has successfully been applied in object detection tasks, and all the user inputs are defined in a human-readable format; thus the framework is easy to adapt to different domains and network structures.…”
Section: Deep Evolutionary Network Structured Representation (mentioning)
confidence: 99%