2009
DOI: 10.1162/artl.2009.15.2.15202
A Hypercube-Based Encoding for Evolving Large-Scale Neural Networks

Abstract: Research in neuroevolution, that is, evolving artificial neural networks (ANNs) through evolutionary algorithms, is inspired by the evolution of biological brains, which can contain trillions of connections. Yet while neuroevolution has produced successful results, the scale of natural brains remains far beyond reach. This article presents a method called hypercube-based NeuroEvolution of Augmenting Topologies (HyperNEAT) that aims to narrow this gap. HyperNEAT employs an indirect encoding called connective comp…
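The core idea in the abstract, an indirect encoding in which one small evolved function describes the connectivity of a much larger network, can be illustrated with a toy sketch. Everything below (the particular function composition, the coordinate convention, the name toy_cppn) is a hypothetical stand-in for an evolved connective CPPN, not the paper's implementation:

```python
import math

def toy_cppn(x1, y1, x2, y2):
    """Toy stand-in for an evolved connective CPPN: it maps the
    coordinates of two substrate neurons to one connection weight,
    so a single small function describes arbitrarily many links."""
    # Gaussian-of-distance and sine nodes are typical CPPN primitives;
    # this particular composition is an illustrative assumption.
    d = math.hypot(x1 - x2, y1 - y2)
    return math.exp(-d * d) * math.sin(3.0 * (x1 + x2))

print(toy_cppn(0.2, 0.0, 0.5, 0.5))  # one queried connection weight
```

Because every possible connection's weight is read off the same function of geometry, the genome stays small no matter how many neurons the substrate contains.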

Cited by 661 publications (634 citation statements)
References 34 publications
“…In HyperNEAT (D'Ambrosio and Stanley, 2007; Gauci and Stanley, 2007; Stanley et al., 2009), neurons are embedded into a substrate that has an externally specified topology. For example, the substrate can be a two-dimensional plane.…”
Section: The NEAT Neuroevolution Methods and Derivatives
confidence: 99%
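As a toy illustration of the substrate idea quoted above (the grid size and the [-1, 1] coordinate range are assumptions, not taken from the cited papers), neuron positions on a 2-D plane might be laid out like this:

```python
# Hypothetical 5x5 substrate: neuron positions are fixed in advance
# on a 2-D plane, here normalized to [-1, 1] in each dimension.
GRID = 5
substrate = [
    (2.0 * i / (GRID - 1) - 1.0, 2.0 * j / (GRID - 1) - 1.0)
    for i in range(GRID)
    for j in range(GRID)
]
print(len(substrate))  # 25 neurons; the topology is chosen by the experimenter
```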
“…HyperNEAT (Gauci and Stanley, 2007; D'Ambrosio and Stanley, 2007; Stanley et al., 2009) is the most well-known recent method for evolving large neural networks. In HyperNEAT, the number and positions of neurons on a substrate are prespecified, while a NEAT-like compositional pattern producing network (CPPN) that gets the neuron coordinates as inputs specifies whether neurons are connected and what the weight of their connection is.…”
Section: Comparison to Other Methods for Evolving Large Networks
confidence: 99%
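The connect-and-weigh mechanism this statement describes can be sketched as follows; the CPPN stand-in, the tiny substrate, and the 0.2 expression threshold are illustrative assumptions rather than the exact values used in the cited work:

```python
import math

def toy_cppn(x1, y1, x2, y2):
    # Stand-in for an evolved CPPN (see the earlier sketch).
    d = math.hypot(x1 - x2, y1 - y2)
    return math.exp(-d * d) * math.sin(3.0 * (x1 + x2))

THRESHOLD = 0.2  # assumed expression threshold, not a value from the paper

def connect(substrate):
    """One CPPN query per ordered neuron pair: the output magnitude
    decides whether the link exists, and its value sets the weight."""
    links = {}
    for a, (x1, y1) in enumerate(substrate):
        for b, (x2, y2) in enumerate(substrate):
            w = toy_cppn(x1, y1, x2, y2)
            if abs(w) > THRESHOLD:   # connection is expressed
                links[(a, b)] = w    # weight read off the CPPN output
    return links

substrate = [(-1.0, 0.0), (0.0, 0.0), (1.0, 0.0)]  # tiny example substrate
print(connect(substrate))
```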
“…Such manual intervention is time-consuming, requires expert knowledge, and adds constraints that may hurt performance. Previous work has shown that the Hypercube-based NeuroEvolution of Augmenting Topologies (HyperNEAT) generative encoding [20] can automatically generate a variety of regular gaits that outperform gaits evolved with direct encodings [4,7]. However, that work verified these claims only in simulation.…”
Section: Introduction
confidence: 94%
“…The first generative encoding scheme evaluated is a simplified version of the HyperNEAT indirect encoding (Stanley et al., 2009; Clune et al., 2011; Yosinski et al., 2011). The CPPNs encode the weights of a fixed-topology, single-layer feedforward ANN, featuring 2-D Cartesian grids of input, hidden, and output neurons (Fig.…”
Section: Encoding ANNs with CPPNs (Minimal HyperNEAT)
confidence: 99%
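A minimal sketch of that fixed-topology setup might look like the following; the 3x3 grids and the CPPN stand-in are hypothetical, chosen only to show one weight being generated per (input, output) pair:

```python
import math

def toy_cppn(x1, y1, x2, y2):
    # Stand-in for the evolved CPPN; only its interface matters here.
    return math.sin(2.0 * x1 * x2) * math.exp(-(y1 - y2) ** 2)

# Assumed 3x3 input grid and 3x3 output grid on [-1, 1]^2.
coords = [(i - 1.0, j - 1.0) for i in range(3) for j in range(3)]

# Fixed-topology, single-layer feedforward ANN: exactly one weight per
# (input, output) pair, all generated by the same small CPPN.
weights = [[toy_cppn(xi, yi, xo, yo) for (xo, yo) in coords]
           for (xi, yi) in coords]
print(len(weights), len(weights[0]))  # 9 x 9 weight matrix
```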
“…Evolvability is analyzed in three independent experiments for a parametrized direct encoding, a generative encoding of artificial neural networks (similar to HyperNEAT; Stanley et al., 2009), and single-unit pattern generators (Morse et al., 2013). The significance of our measure of evolvability is analyzed by the robot's ability to adapt to previously unencountered changes in its morphology.…”
Section: Introduction
confidence: 99%