Proceedings of the 2014 Annual Conference on Genetic and Evolutionary Computation
DOI: 10.1145/2576768.2598369
Guided self-organization in indirectly encoded and evolving topographic maps

Abstract: An important phenomenon seen in many areas of biological brains and recently in deep learning architectures is a process known as self-organization. For example, in the primary visual cortex, color and orientation maps develop based on lateral inhibitory connectivity patterns and Hebbian learning dynamics. These topographic maps, which are found in all sensory systems, are thought to be a key factor in enabling abstract cognitive representations. This paper shows for the first time that the Hypercube-based Neu…
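The self-organization process the abstract describes can be illustrated with a minimal Kohonen-style sketch: this is not the paper's HyperNEAT-based method, just a toy showing how Hebbian-like winner-take-all updates plus a local neighborhood (a stand-in for lateral connectivity) let weights organize into a topographic map of the input space. All constants (sheet size, learning rate, neighborhood width) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_units = 20                       # 1-D sheet of neurons
weights = rng.random(n_units)      # each unit's preferred 1-D input value

for step in range(2000):
    x = rng.random()                         # random scalar input in [0, 1)
    winner = np.argmin(np.abs(weights - x))  # best-matching (most active) unit
    for j in range(n_units):
        # Gaussian neighborhood: units near the winner move with it,
        # distant units barely move (a crude lateral-interaction profile)
        h = np.exp(-((j - winner) ** 2) / (2 * 2.0 ** 2))
        weights[j] += 0.1 * h * (x - weights[j])

# After training, neighboring units prefer neighboring inputs:
# the weight array is (approximately) monotonically ordered.
```

After enough samples, unit index and preferred input value become strongly correlated, which is the defining property of a topographic map.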

Cited by 6 publications (3 citation statements)
References 33 publications
“…The artificial counterpart of the mutation-selection process, namely, the evolutionary algorithm (EA), has been applied in numerous domains for decades, and “neuroevolution” refers to the application of EA to neural networks (Yao and Liu, 1998 ; Stanley et al, 2019 ; Galván and Mooney, 2021 ). Although the neuroevolution scheme simplified or omitted numerous aspects of the biological evolution process, it successfully captured the essentials and performed well in rediscovering the BNN properties (Risi and Stanley, 2014 ) and optimizing the ANN architecture (Liang et al, 2018 ; Zoph et al, 2018 ). In addition to structural connectivity, network architecture comprises the functional features of a network, such as the activation function of each neuron and its hyperparameters or initial synaptic weights.…”
Section: Optimization Strategy: Multiscale Credit Assignment
confidence: 99%
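The mutation-selection loop this statement describes can be sketched as a (1+1)-style neuroevolution run: mutate the weights of a tiny fixed-topology network and keep the mutant only if it scores better. The network shape, task (regressing y = x²), and mutation scale are illustrative assumptions, not the cited papers' setups.

```python
import numpy as np

rng = np.random.default_rng(1)
xs = np.linspace(-1, 1, 32)
ys = xs ** 2                       # toy regression target

def forward(w, x):
    # 1 input -> 8 tanh hidden units -> 1 linear output
    w1, b1, w2, b2 = w
    h = np.tanh(np.outer(x, w1) + b1)
    return h @ w2 + b2

def fitness(w):
    return -np.mean((forward(w, xs) - ys) ** 2)  # negative MSE

# genome = all weights and biases of the fixed topology
parent = [rng.normal(0, 0.5, 8), np.zeros(8), rng.normal(0, 0.5, 8), 0.0]
best = fitness(parent)
for gen in range(5000):
    # mutation: small Gaussian perturbation of every parameter
    child = [p + rng.normal(0, 0.05, np.shape(p)) for p in parent]
    f = fitness(child)
    if f > best:                   # selection: keep the better genome
        parent, best = child, f
```

Real neuroevolution systems such as NEAT additionally mutate the topology itself; this sketch evolves weights only.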
“…Adaptive ES-HyperNEAT allows each individual synaptic connection, rather than neuron, to be standard or modulatory, thus introducing further design flexibility. Risi and Stanley (2014) showed how adaptive HyperNEAT can be seeded to produce a specific lateral connectivity pattern, thereby allowing the weights to self-organize to form a topographic map of the input space. The study shows that evolution can be seeded with specific plasticity mechanisms that can facilitate the evolution of specific types of learning.…”
Section: F Evolving Indirectly Encoded Plasticity
confidence: 99%
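The standard-vs-modulatory distinction in this statement can be loosely illustrated with a Hebbian update applied only through a per-connection gate: gated-open connections adapt, gated-closed ones stay fixed. The rule, network size, and constants are assumptions for demonstration, not adaptive ES-HyperNEAT's evolved plasticity rule.

```python
import numpy as np

rng = np.random.default_rng(2)
w = rng.normal(0, 0.1, (4, 4))      # synaptic weights
w0 = w.copy()                       # remember the initial weights
gate = rng.random((4, 4)) > 0.5     # which connections are plastic
eta = 0.05                          # Hebbian learning rate

for _ in range(100):
    pre = rng.random(4)             # presynaptic activity
    post = w @ pre                  # postsynaptic activity
    # Hebbian term, applied only where the gate allows plasticity
    dw = eta * np.outer(post, pre)
    w += np.where(gate, dw, 0.0)
    w = np.clip(w, -1.0, 1.0)       # keep runaway Hebbian growth bounded
```

After the loop, gated-closed connections still hold their initial values while gated-open ones have adapted to the input statistics.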
“…However, novel combinations of recent advances such as more advanced forms of local plasticity (e.g. neuromodulation [113]), hypothesis testing in distal reward learning [112], larger indirectly-encoded adaptive networks [91][92][93], methods that avoid deception inherent in evolving learning architectures [63,94], and learning of large behavioral repertoires [23], could allow the creation of learning networks for more complex domains such as games.…”
Section: E Combining NE With Life-long Learning
confidence: 99%