2018
DOI: 10.1162/neco_a_01057
A Unifying Framework of Synaptic and Intrinsic Plasticity in Neural Populations

Abstract: A neuronal population is a computational unit that receives a multivariate, time-varying input signal and creates a related multivariate output. These neural signals are modeled as stochastic processes that transmit information in real time, subject to stochastic noise. In a stationary environment, where the input signals can be characterized by constant statistical properties, the systematic relationship between its input and output processes determines the computation carried out by a population. When these …

Cited by 11 publications (12 citation statements)
References 36 publications
“…These observations speak to the biological plausibility of the variational message passing scheme used to simulate neural responses in this paper. Although several biologically plausible blind source separation methods in the continuous state space have been developed 44–48, to our knowledge, this is the first attempt to explain neuronal blind source separation using a biologically plausible learning algorithm in the discrete (binary) state space.…”
Section: Discussion
confidence: 99%
“…This intrinsic probabilistic property of memristive devices can be exploited for implementing stochastic learning in neuromorphic architectures 43,44,48,64–66, which in turn can be used to implement faithful models of biological cortical microcircuits 67,68, solve memory capacity and classification problems in artificial neural network applications 69,70, and reduce the network sensitivity to their variability 43. Recent results on stochastic learning modulated by regularization mechanisms, such as homeostasis or intrinsic plasticity 44,71–73, present an excellent potential for exploiting the features of memristive devices, even when restricted to binary values. e. Don't (hard) limit your devices.…”
confidence: 99%
“…Nonetheless, the manner in which the brain can possibly solve a nonlinear BSS problem remains unclear, even though it might be a prerequisite for many of its cognitive processes, such as visual recognition (DiCarlo et al, 2012). While Oja's subspace rule for PCA (Oja, 1989) and Amari's ICA algorithm (Amari et al, 1996) were used in this article, these rules can be replaced with more biologically plausible local Hebbian learning rules (Foldiak, 1990; Linsker, 1997; Pehlevan, Mohan, & Chklovskii, 2017; Isomura & Toyoizumi, 2016; Leugering & Pipa, 2018) that require only directly accessible signals to update synapses. A recent work indicated that even a single-layer neural network can perform both PCA and ICA through a local learning rule (Isomura & Toyoizumi, 2018), implying that even a single-layer network can perform a nonlinear BSS.…”
Section: Discussion
confidence: 99%
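The Oja subspace rule named in the excerpt above updates a weight matrix so that its rows converge to an orthonormal basis of the top principal subspace of the input. A minimal sketch, assuming synthetic Gaussian data with two dominant variance directions (the dimensions, learning rate, and data here are illustrative, not taken from the cited works):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 5-D samples whose variance is concentrated in 2 directions,
# so the top principal subspace is well defined.
n, d, k = 5000, 5, 2
scales = np.array([3.0, 2.0, 0.3, 0.2, 0.1])
X = rng.normal(size=(n, d)) * scales  # one sample per row

# Oja's subspace rule: W <- W + eta * (y x^T - y y^T W), where y = W x.
# The y x^T term is Hebbian; the y y^T W term keeps the rows orthonormal.
W = 0.1 * rng.normal(size=(k, d))
eta = 0.005
for _ in range(10):            # a few passes over the data
    for x in X:
        y = W @ x
        W += eta * (np.outer(y, x) - np.outer(y, y) @ W)

print(np.round(W @ W.T, 2))    # close to the 2x2 identity (orthonormal rows)
print(np.round(W[:, 2:], 2))   # near-zero weight on the low-variance directions
```

Unlike full PCA, the rule recovers the principal subspace but not the individual eigenvectors, which is why the cited works pair it with an ICA stage.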
“…Our theorems indicate that a combination of PCA and ICA algorithms can reliably separate hidden sources that are nonlinearly mixed in the environment, when sufficiently rich sensory inputs are provided. While we have used Oja's subspace rule for PCA and Amari's ICA rule in this paper, more biologically plausible local Hebbian learning rules have been proposed [59–64]. A recent work showed that even a single-layer neural network can perform both PCA and ICA through a local learning rule [63].…”
Section: Discussion
confidence: 99%
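The PCA-then-ICA pipeline these excerpts describe can be sketched with Amari's natural-gradient ICA rule applied to whitened mixtures. A minimal illustration, assuming two Laplacian (super-Gaussian) sources and an invented 2×2 mixing matrix (none of these values come from the cited works):

```python
import numpy as np

rng = np.random.default_rng(1)

# Two independent super-Gaussian (Laplacian) sources, linearly mixed.
n = 10000
S = rng.laplace(size=(2, n))
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])       # illustrative mixing matrix
X = A @ S

# Whiten the mixtures (the PCA stage of the PCA -> ICA pipeline).
evals, evecs = np.linalg.eigh(np.cov(X))
V = evecs @ np.diag(evals ** -0.5) @ evecs.T
Xw = V @ X

# Amari's natural-gradient ICA rule, batch form:
#   W <- W + eta * (I - E[phi(y) y^T]) W,  with phi(y) = tanh(y)
# for super-Gaussian sources and y = W x.
W = np.eye(2)
eta = 0.1
for _ in range(500):
    Y = W @ Xw
    W += eta * (np.eye(2) - np.tanh(Y) @ Y.T / n) @ W

# The overall transform recovers the sources up to permutation and scale,
# so each row of W @ V @ A should have one dominant entry.
P = W @ V @ A
print(np.round(P, 2))
```

The biologically plausible alternatives cited in the excerpts replace this natural-gradient step with local Hebbian updates that use only signals available at each synapse.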