2016
DOI: 10.1016/j.cobeha.2016.05.012
Does computational neuroscience need new synaptic learning paradigms?

Cited by 38 publications (37 citation statements)
References 108 publications
“…We found that if one combines local rules for synaptic plasticity with learning signals from a separate network that is optimized for inducing fast learning of RSNNs, one-shot learning becomes feasible in a biologically realistic manner. This solves a well-known open problem in computational neuroscience (Brea and Gerstner, 2016). We have also shown that RSNNs can learn in this way to estimate posterior probabilities, thereby providing a new basis for modeling probabilistic computing and learning in RSNNs.…”
Section: Discussion
confidence: 65%
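The statement above describes combining local synaptic plasticity with a learning signal supplied by a separate network, i.e. a three-factor learning rule. A minimal sketch of such an update, with a hypothetical per-neuron top-down signal gating an otherwise local Hebbian term (all names and values here are illustrative, not from the cited work):

```python
import numpy as np

rng = np.random.default_rng(0)

n_pre, n_post = 4, 3
w = rng.normal(scale=0.1, size=(n_post, n_pre))  # synaptic weights

def three_factor_update(w, pre, post, learning_signal, eta=0.1):
    """One plasticity step: a local Hebbian term (pre x post activity)
    gated by a per-neuron learning signal from a separate network."""
    # The outer product of post- and presynaptic activity is local to
    # each synapse; the learning signal is the third, top-down factor.
    return w + eta * (learning_signal[:, None] * np.outer(post, pre))

pre = rng.random(n_pre)               # presynaptic activity
post = rng.random(n_post)             # postsynaptic activity
signal = np.array([1.0, -0.5, 0.0])   # hypothetical top-down learning signal

w_new = three_factor_update(w, pre, post, signal)
# A zero learning signal leaves that neuron's synapses unchanged.
```

The key property is that each synapse only needs quantities available at that synapse plus one broadcast scalar per neuron, which is what makes such rules candidates for biologically realistic credit assignment.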
“…Application 1: One-shot learning of new arm movements Brea and Gerstner (2016) argue that one-shot learning is one of two really important learning capabilities of the brain that are not yet satisfactorily explained by current models in computational neuroscience. We demonstrate that natural e-prop supports one-shot learning in two very different contexts.…”
Section: Introduction
confidence: 99%
“…We argue that the above mechanisms of local self-organization are likely insufficient to account for the brain's powerful learning performance (Brea and Gerstner, 2016). To elaborate on the need for an efficient means of gradient computation in the brain, we will first place backpropagation into its computational context (Hinton, 1989; Baldi and Sadowski, 2015).…”
Section: Biological Implementation Of Optimization
confidence: 99%
“…Efforts are underway to effectively train spiking neural networks (Gerstner et al, 2014;Gerstner and Kistler, 2002;O'Connor and Welling, 2016;Huh and Sejnowski, 2017) and endow them with the same cognitive capabilities as their rate-based cousins Zambrano and Bohte, 2016;Kheradpisheh et al, 2016;Lee et al, 2016;Thalmeier et al, 2015). In the same vein, researchers are exploring how probabilistic computations can be performed in neural networks (Pouget et al, 2013;Nessler et al, 2013;Orhan and Ma, 2016;Heeger, 2017) and deriving new biologically plausible synaptic plasticity rules (Schiess et al, 2016;Brea and Gerstner, 2016). Biologically-inspired principles may also be incorporated at a more conceptual level.…”
Section: Next-Generation Artificial Neural Networks
confidence: 99%