DOI: 10.1371/journal.pcbi.0020165.eor

Computational aspects of feedback in neural circuits

Abstract: It has previously been shown that generic cortical microcircuit models can perform complex real-time computations on continuous input streams, provided that these computations can be carried out with a rapidly fading memory. We investigate the computational capability of such circuits in the more realistic case where not only readout neurons, but in addition a few neurons within the circuit, have been trained for specific tasks. This is essentially equivalent to the case where the output of trained readout neurons…

Cited by 41 publications (59 citation statements) · References 34 publications

“…Moreover, under the additional assumption that the available processing time is finite, the same conclusion follows for the continuous-time case, if finite temporal precision or temporal noise is assumed. This is essentially what the technical results of Maass and colleagues (Maass & Orponen, 1997; Maass & Sontag, 1999; Maass et al., 2007), as well as others (Casey, 1996; Siegelmann, 1999), entail.…”
Section: Finiteness of Neural Systems
citation type: mentioning
confidence: 54%
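
The finite-state point above can be made concrete with a small sketch (our illustration, with made-up weights and precision, not a construction from the cited papers): once a recurrent network's state is held to finite precision, its dynamics over a finite input alphabet reduce to a finite transition table, which a breadth-first enumeration can exhaust.

```python
# Minimal sketch (illustrative weights/precision): quantizing the state of a
# small tanh recurrent network makes its reachable state set finite, so the
# dynamics over a finite input alphabet form a finite-state transition table.
import numpy as np

rng = np.random.default_rng(0)
n = 3                                    # network size (illustrative)
W = rng.normal(scale=0.5, size=(n, n))   # recurrent weights
w_in = rng.normal(size=n)                # input weights
PREC = 1                                 # decimal digits kept: finite precision

def step(x, u):
    """One network update, with the new state rounded to PREC digits."""
    return tuple(np.round(np.tanh(W @ np.array(x) + w_in * u), PREC))

# Enumerate every state reachable from rest under the input alphabet {0, 1}.
start = (0.0,) * n
states, frontier, transitions = {start}, [start], {}
while frontier:                          # terminates: at most 21**n states
    x = frontier.pop()
    for u in (0.0, 1.0):
        y = step(x, u)
        transitions[(x, u)] = y
        if y not in states:
            states.add(y)
            frontier.append(y)

print(f"reachable quantized states: {len(states)}")
```
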
“…In passing, we note that a qualitative match between the performance of simple recurrent networks and human comprehension of nested (context-free) and crossed (context-sensitive) dependencies has been reported (Christiansen & Chater, 1999; Christiansen & MacDonald, 2009). Because, in a technical sense, noisy or discrete simple recurrent networks are finite-state architectures (Casey, 1996; Maass, Joshi, & Sontag, 2007; Maass & Orponen, 1998; Maass & Sontag, 1999; see also Petersson, 2005b; Petersson, Grenholm, & Forkstam, 2005), these results suggest that actual language processing uses no more on-line memory resources than can be provided by a finite-state architecture. These simulations, of course, only illustrate that recurrent networks can handle (bounded) non-regular processing at some level of proficiency.…”
Section: Introduction
citation type: mentioning
confidence: 63%
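
A toy illustration of the bounded-memory point (ours, not from the cited studies): nested dependencies up to a fixed depth k are recognizable by a finite-state machine whose states simply record the current depth, which is the sense in which bounded non-regular processing fits within a finite-state resource budget.

```python
# Toy recognizer (our example): balanced brackets up to nesting depth k need
# only the k+1 states {0, ..., k}, i.e. a finite-state machine suffices.
def make_bounded_matcher(k):
    """Return a recognizer for balanced '('/')' strings of depth <= k."""
    def accepts(s):
        depth = 0                     # the only state: one of k+1 values
        for ch in s:
            if ch == '(':
                depth += 1
                if depth > k:         # exceeds the finite memory budget
                    return False
            elif ch == ')':
                depth -= 1
                if depth < 0:         # unmatched closing bracket
                    return False
        return depth == 0
    return accepts

accepts = make_bounded_matcher(k=3)
print(accepts("(())"), accepts("(((())))"))   # True False: depth 4 > 3
```
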
“…We present a theoretical framework for morphological computation, which is based on a result by Maass et al. (2007). They proved that a certain class of nonlinear dynamical systems (which can have the property of fading memory) gains the computational power to emulate arbitrary nonlinear systems (which can have persistent memory) simply by adding a suitable static (memoryless) feedback and a suitable static (memoryless) readout function.…”
Section: Theoretical Foundations
citation type: mentioning
confidence: 99%
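
A minimal sketch of the flavor of that result (our construction; the circuit and all parameters are illustrative, not taken from Maass et al., 2007): a leaky integrator on its own has fading memory, but closing a static, memoryless tanh feedback loop around it makes the system bistable, so a brief input pulse is retained indefinitely.

```python
# Sketch (illustrative parameters): a static feedback f(x) = gain*tanh(x)
# turns a fading-memory leaky integrator into a bistable one-bit memory.
import numpy as np

dt, lam, gain = 0.01, 1.0, 2.0   # time step, leak rate, feedback gain

def run(pulses, T=10.0, feedback=True):
    """Simulate dx/dt = -lam*x + gain*tanh(x)*[feedback on] + u(t)."""
    steps = int(T / dt)
    x, trace = 0.0, np.empty(steps)
    for t in range(steps):
        u = sum(a for (t0, t1, a) in pulses if t0 <= t * dt < t1)
        fb = gain * np.tanh(x) if feedback else 0.0
        x += dt * (-lam * x + fb + u)
        trace[t] = x
    return trace

# a brief "set" pulse at t=1s and a "reset" pulse at t=6s
pulses = [(1.0, 1.5, 5.0), (6.0, 6.5, -5.0)]
closed = run(pulses, feedback=True)    # bistable: holds the stored bit
open_ = run(pulses, feedback=False)    # fading memory: pulses decay away
print("t=5.0s:", round(closed[500], 2), "vs", round(open_[500], 2))
print("t=9.5s:", round(closed[950], 2), "vs", round(open_[950], 2))
```

The feedback here is purely static, a fixed function of the current state, which is the kind of addition the quoted result allows.
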
“…The abstract idea of this concept is that a complex network of calculating entities (e.g., neurons) is so diverse that each task is solved somewhere within the network (Maass et al. 2002; Buonomano and Maass 2009; Maass 2010). However, one problem with this approach is the capacity, which depends sublinearly on the number of neurons (Ganguli et al. 2008); another problem is the read-out of the task-specific information from the network (Maass et al. 2007; Legenstein et al. 2008).…”
Section: Physiological Mechanism
citation type: mentioning
confidence: 99%
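
To make the read-out problem concrete, here is a generic reservoir-computing sketch (ours; the network size, ridge parameter, and delayed-recall task are illustrative): the recurrent circuit stays fixed and random, and only a linear readout is fitted, by ridge regression, to extract task-specific information, here the input from five steps earlier.

```python
# Sketch (illustrative setup): a fixed random recurrent network provides a
# high-dimensional trace of the input stream; a linear readout trained by
# ridge regression recovers a 5-step-delayed copy of the input.
import numpy as np

rng = np.random.default_rng(1)
N, T, delay = 200, 2000, 5

W = rng.normal(size=(N, N)) / np.sqrt(N)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius < 1
w_in = rng.normal(size=N)

u = rng.uniform(-1, 1, size=T)                    # random input stream
X, x = np.zeros((T, N)), np.zeros(N)
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])              # fixed, untrained circuit
    X[t] = x

y = np.roll(u, delay)                             # target: input 5 steps ago
X_tr, y_tr = X[100:1500], y[100:1500]             # drop transient, then split
X_te, y_te = X[1500:], y[1500:]

ridge = 1e-4                                      # regularization strength
w_out = np.linalg.solve(X_tr.T @ X_tr + ridge * np.eye(N), X_tr.T @ y_tr)
err = np.sqrt(np.mean((X_te @ w_out - y_te) ** 2))
print(f"test RMSE on 5-step delayed recall: {err:.3f}")
```
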