2013 Asilomar Conference on Signals, Systems and Computers
DOI: 10.1109/acssc.2013.6810296

A neuron as a signal processing device

Abstract: A neuron is a basic physiological and computational unit of the brain. While much is known about the physiological properties of a neuron, its computational role is poorly understood. Here we propose to view a neuron as a signal processing device that represents the incoming streaming data matrix as a sparse vector of synaptic weights scaled by an outgoing sparse activity vector. Formally, a neuron minimizes a cost function comprising a cumulative squared representation error and regularization terms. …
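A minimal LaTeX sketch of what such a cost function could look like; the symbols (data snapshots x_t, synaptic weight vector w, activities y_t, regularization weights alpha and beta) are assumptions, since the abstract excerpt does not spell out the notation:

\min_{\mathbf{w},\, y_1,\dots,y_T} \;
  \sum_{t=1}^{T} \left\| \mathbf{x}_t - \mathbf{w}\, y_t \right\|_2^2
  \;+\; \alpha \,\|\mathbf{w}\|_1
  \;+\; \beta \sum_{t=1}^{T} |y_t|

The first term is the cumulative squared representation error over the streamed snapshots; the L1 penalties promote sparsity of both the synaptic weights and the activity, matching the "sparse vector of synaptic weights scaled by an outgoing sparse activity vector" in the abstract.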


Cited by 8 publications (10 citation statements)
References 28 publications
“…By summing these activities with the weights of the corresponding synapses, a neuron projects each data sample onto the vector of synaptic weights and transmits the projection to downstream neurons via its output activity. If synaptic weights are updated after each data sample presentation according to the Oja learning rule, a neuron computes the top eigenvector of the covariance matrix and outputs the first principal component [4], [5]. Here, we ignore temporal correlations in activity and assume that the dataset is presented as a sequence of static "snapshots" streamed in an arbitrary order.…”
Section: Introduction
confidence: 99%
“…One approach toward understanding synaptic function comes from the theory of dynamical systems (Spruston, 2008; Hu et al, 2013). This includes models of the electrical and physiological properties of a neuronal cell and of the reliability with which it communicates information across synaptic connections.…”
Section: Results
confidence: 99%
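As a generic example of the dynamical-systems view of a neuronal cell mentioned here (not taken from the cited works; the parameters are illustrative textbook values), a leaky integrate-and-fire model can be simulated in a few lines of Python:

import numpy as np

def lif_simulate(current, dt=1e-4, tau=0.02, v_rest=-0.065,
                 v_th=-0.050, v_reset=-0.065, R=1e7):
    # Leaky integrate-and-fire: tau * dv/dt = -(v - v_rest) + R * I(t);
    # a spike is emitted and v is reset whenever v crosses threshold.
    v, trace, spikes = v_rest, [], []
    for step, i_t in enumerate(current):
        v += (dt / tau) * (-(v - v_rest) + R * i_t)
        if v >= v_th:
            spikes.append(step * dt)
            v = v_reset
        trace.append(v)
    return np.array(trace), spikes

# Usage: constant 2 nA input over 0.5 s of simulated time
trace, spike_times = lif_simulate(np.full(5000, 2e-9))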
“…are optima of (3), where (5) uniquely defines all optima of (3), except when $k < m$, $\lambda^X_k > \alpha T$, and $\lambda^X_k = \lambda^X_{k+1}$.…”
Section: Soft-thresholding of Covariance Eigenvalues
confidence: 99%
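The soft-thresholding that this quote's condition refers to can be illustrated directly: each covariance eigenvalue is shrunk toward zero by a fixed threshold and clipped at zero. A minimal Python sketch (the names soft_threshold_covariance and alpha_T are assumptions, not the paper's notation):

import numpy as np

def soft_threshold_covariance(C, alpha_T):
    # Eigendecompose the symmetric covariance C, shrink each eigenvalue
    # by alpha_T (clipping at zero), and rebuild the matrix.
    lam, V = np.linalg.eigh(C)
    lam_shrunk = np.maximum(lam - alpha_T, 0.0)
    return V @ np.diag(lam_shrunk) @ V.T

# Usage on a sample covariance matrix
X = np.random.default_rng(0).standard_normal((200, 6))
C_st = soft_threshold_covariance(np.cov(X, rowvar=False), alpha_T=0.5)

The exceptional case in the quote, a repeated eigenvalue above the threshold, is exactly where the eigenvectors, and hence the optimum, stop being unique.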
“…In the second phase of the algorithm, synaptic weights are updated for feedforward connections according to a local Hebbian rule (15) and for lateral connections according to a local anti-Hebbian rule (due to the (−) sign in equation (13)). Interestingly, in the α = 0 limit, these updates have the same form as the single-neuron Oja rule [24,2], except that the learning rate is not a free parameter but is determined by the cumulative neuronal activity $1/D^Y_{T+1,i}$ [4,5].…”
Section: Online Soft-thresholding of Eigenvalues
confidence: 99%
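A minimal sketch of one such weight-update step, assuming the output activities y for the current sample were already computed in the algorithm's first phase; the matrix names W, M and the cumulative-activity vector D are assumptions, not the cited paper's exact equations:

import numpy as np

def update_weights(W, M, D, x, y):
    # Feedforward W: local Hebbian update; lateral M: anti-Hebbian
    # (M enters the neuronal dynamics with a minus sign). Per-neuron
    # learning rates 1/D_i shrink as cumulative activity accumulates.
    D += y * y
    eta = 1.0 / D
    W += eta[:, None] * (np.outer(y, x) - (y**2)[:, None] * W)
    M += eta[:, None] * (np.outer(y, y) - (y**2)[:, None] * M)
    np.fill_diagonal(M, 0.0)     # no self-inhibition
    return W, M, D

In the single-output limit with no lateral connections, the W update above reduces to an Oja-style rule whose rate is set by the running activity, matching the quote's observation.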