2016
DOI: 10.1016/j.neuron.2016.02.009
Recurrent Network Models of Sequence Generation and Memory

Abstract: Sequential activation of neurons is a common feature of network activity during a variety of behaviors, including working memory and decision making. Previous network models for sequences and memory emphasized specialized architectures in which a principled mechanism is pre-wired into their connectivity. Here, we demonstrate that starting from random connectivity and modifying a small fraction of connections, a largely disordered recurrent network can produce sequences and implement working memory effi…
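The abstract's starting point — a recurrent network with fully random ("disordered") connectivity — can be sketched in a few lines. This is only an illustration of such a network's baseline dynamics, not the paper's training procedure for shaping sequences; all parameter values (network size, gain, step size) are illustrative choices, not taken from the paper.

```python
import numpy as np

# Minimal sketch of a randomly connected rate network: the disordered
# starting point described in the abstract, before any connections are
# modified. Parameters here are illustrative assumptions.

rng = np.random.default_rng(0)

N = 200    # number of units
g = 1.5    # gain of the random connectivity (fluctuating regime for g > 1)
dt = 0.1   # integration step
T = 500    # number of time steps

# Random connectivity: entries drawn i.i.d. with variance g^2 / N
J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))

x = rng.normal(0.0, 0.5, size=N)   # initial unit states
rates = np.empty((T, N))

for t in range(T):
    r = np.tanh(x)                 # firing rates
    x = x + dt * (-x + J @ r)      # leaky rate dynamics: dx/dt = -x + J r
    rates[t] = r

print(rates.shape)  # (500, 200)
```

With gain above the critical value, unit activity fluctuates irregularly rather than settling to a fixed point; the paper's contribution, per the abstract, is that modifying only a small fraction of such a network's connections suffices to sculpt these fluctuations into reliable sequences.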

Cited by 345 publications (466 citation statements). References 74 publications.
“…Nevertheless, even in its crudest form, the Potts network with its latching dynamics can be used to explore e.g., novel theories as to the evolutionary origin of complex cognition [39]. It establishes a quantitative framework to understand phase transitions [25], complementary to the perspective offered by other modelling approaches to sequence generation in cortical networks [40]. At the most abstract level, it can be considered an implementation of a fuzzy logic system [41,42], but with the critical advantage that its parameters can eventually be related to cortical parameters, as we begin to describe in a related study [32].…”
Section: Discussion
confidence: 99%
“…[31] [24], [3], [32], [28]), but the networks in these studies were all trained in a deterministic setting. An important recent development in deep learning has been the advent of the variational auto-encoder [19] [29], which combines a probabilistic framework with the power and ease of optimization of deep learning methods.…”
Section: LFADS Related Work in Machine Learning Literature
confidence: 99%
“…http://dx.doi.org/10.1101/152884 doi: [28]). It should be stressed that the vanilla RNN used as the data RNN here does not have the same functional form as the network generator used in the LFADS framework, which is a GRU (see section 1.6).…”
confidence: 99%
“…In recent years, there has been renewed interest in modeling complex human behaviors such as memory and motor skills using neural networks (Sussillo et al 2015;Rajan, Harvey, and Tank 2016;Hennequin, Vogels, and Gerstner 2014;Carnevale et al 2015;Laje, Buonomano, and Buonomano 2013). However, training these networks to produce meaningful behavior has proven difficult.…”
Section: Discussion
confidence: 99%