2018 Conference on Cognitive Computational Neuroscience
DOI: 10.32470/ccn.2018.1086-0
Short-term Sequence Memory: Compressive effects of Recurrent Network Dynamics

Abstract: Neural networks have become ubiquitous as cognitive models in neuroscience and as machine learning systems. Deep neural networks in particular are achieving near-human performance in many applications. More recently, recurrent neural networks (RNNs) have been increasingly utilized, both as standalone structures and as layers of deep networks. RNNs are especially interesting because cortical networks are recurrent, indicating that recurrent connections are important in human-level processing. Despite their growing…

Cited by 1 publication (1 citation statement)
References 23 publications
“…Nowadays, RNNs are gaining renewed interest in neuroscience due to their biological plausibility [7-10] and in computer science and engineering for their modeling ability [11, 12]. RNNs are capable of generating complex dynamics and performing inference based on current inputs and internal state, the latter maintaining a vanishing memory of past inputs [13, 14].…”
Citation type: mentioning (confidence: 99%)