2016
DOI: 10.1038/srep21142
Recurrent Spiking Networks Solve Planning Tasks

Abstract: A recurrent spiking neural network is proposed that implements planning as probabilistic inference for finite and infinite horizon tasks. The architecture splits this problem into two parts: the stochastic transient firing of the network embodies the dynamics of the planning task. With appropriate injected input, this dynamics is shaped to generate high-reward state trajectories. A general class of reward-modulated plasticity rules for these afferent synapses is presented. The updates optimize the likelihood of…
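The mechanism the abstract describes — stochastic recurrent dynamics sampling state trajectories, with reward-modulated plasticity on afferent synapses raising the likelihood of high-reward trajectories — can be illustrated with a minimal toy sketch. Everything below (network size, the sigmoid firing probability, the eligibility trace, the stand-in reward) is a hypothetical simplification for illustration, not the paper's actual model:

```python
import numpy as np

# Toy sketch: reward-modulated plasticity on the afferent weights of a
# stochastic recurrent network (all sizes and choices are hypothetical).
rng = np.random.default_rng(0)

N, T, eta = 20, 50, 0.1                        # units, steps per trajectory, learning rate
W = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))  # fixed recurrent weights (task dynamics)
w_in = np.zeros(N)                             # plastic afferent (input) weights

def run_trial(w_in):
    """Sample one stochastic spike trajectory; accumulate the
    score-function eligibility d/dw_in log P(trajectory)."""
    s = np.zeros(N)
    spikes = np.zeros((T, N))
    elig = np.zeros(N)
    for t in range(T):
        u = W @ s + w_in                        # membrane potential
        p = 1.0 / (1.0 + np.exp(-u))            # firing probability
        s = (rng.random(N) < p).astype(float)   # stochastic spikes
        elig += s - p                           # REINFORCE-style eligibility
        spikes[t] = s
    return spikes, elig

def reward(spikes):
    """Stand-in reward: mean firing of unit 0 (a proxy 'high-reward state')."""
    return spikes[:, 0].mean()

baseline = 0.0
for _ in range(300):
    spikes, elig = run_trial(w_in)
    R = reward(spikes)
    w_in += eta * (R - baseline) * elig        # reward-modulated update
    baseline += 0.05 * (R - baseline)          # running reward baseline
```

The update is a likelihood-gradient (policy-gradient) step: trajectories that earned more reward than the running baseline have their firing pattern made more probable via the afferent weights, which is the spirit of "optimizing the likelihood of high-reward trajectories", though the paper's actual plasticity rules differ in detail.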

Cited by 56 publications (92 citation statements)
References 57 publications
“…This idea is in line with recent proposals in machine learning, [9] and [10], that use also semi-structured recurrent network models for planning. In comparison to them, we extend their results by adding a second structure along with the recurrent network, an associative map (AM), that will recursively and timely control it; see Fig 1b).…”
Section: Introduction (supporting)
confidence: 65%
“…These and other formal schemes may be used to test empirical predictions on how rodents solve challenging navigation tasks or even how humans solve abstract tasks . Compared with other computational proposals that also discuss hippocampal function in relation to generative models, we have stressed the idea that hippocampal coding and processing may be organized around sequences. Hippocampal generative models may be preconfigured for sequential processing and the rapid encoding of arbitrary sequences of events in ways that current machine learning techniques fail to model.…”
Section: Discussion (mentioning)
confidence: 99%
“…Finally, a series of computational studies explored the idea that various aspects of spatial cognition, including spatial decision making, route planning, model selection, vicarious trial and error (VTE), and the covert evaluation of future spatial trajectories, may be based on probabilistic inference and a common generative model (implemented in the hippocampus and surrounding structures), also discussing various neuronal implementations of (approximate) Bayesian inference …”
Section: A Computational Perspective On Igss (mentioning)
confidence: 99%
“…At the low level, the model relies on the neural mechanisms and the unsupervised learning plasticity rule proposed in [24] (plus a reinforcement learning rule) to implement a HMM with a spiking recurrent neural network. A recurrent spiking neural network to implement the world model and planning was also used in [35,36]. With respect to these previous models, our architecture presents a number of structure and functioning novelties at the system level.…”
mentioning
confidence: 99%