2019
DOI: 10.1038/s41598-019-54137-7

Deterministic networks for probabilistic computing

Abstract: Neuronal network models of high-level brain functions such as memory recall and reasoning often rely on the presence of some form of noise. The majority of these models assumes that each neuron in the functional network is equipped with its own private source of randomness, often in the form of uncorrelated external noise. In vivo, synaptic background input has been suggested to serve as the main source of noise in biological neuronal networks. However, the finiteness of the number of such noise sources consti…
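To make the private-noise assumption described in the abstract concrete, here is a minimal sketch: each of N leaky integrator neurons receives its own independent Poisson background input. All names and parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N, steps, dt = 100, 1000, 0.1    # neurons, time steps, step size (ms); illustrative
tau, rate, w = 10.0, 5.0, 0.5    # membrane tau (ms), background rate (spikes/ms), weight

u = np.zeros(N)                  # membrane potentials
for _ in range(steps):
    # one private, uncorrelated Poisson noise source per neuron
    bg = rng.poisson(rate * dt, size=N)
    u += -dt / tau * u + w * bg  # leaky integration plus background drive
print("mean membrane potential:", u.mean())
```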

Cited by 16 publications (15 citation statements)
References 74 publications (164 reference statements)
“…The additive noise ξ represents white noise with variance 2Cλ_e, arising, for example, from unspecific background inputs [24,25]. For fixed presynaptic activity r, the average somatic membrane potential hence represents a maximum-a-posteriori estimate (MAP, [16]), while its variance is inversely proportional…”
Section: Bayesian Neuronal Dynamics
confidence: 99%
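A rough way to see the quoted statement is an Ornstein-Uhlenbeck-type simulation. The dynamics below are an assumed generic form, not the cited paper's equations: the membrane potential u relaxes toward a fixed point mu set by the presynaptic activity, driven by white noise of variance 2·C·λ_e, so its time average approximates the MAP value mu while its variance reflects the noise amplitude.

```python
import numpy as np

rng = np.random.default_rng(1)
C, lam_e, mu = 1.0, 0.2, -55.0   # capacitance, rate parameter, fixed point (mV); assumed
dt, steps = 0.01, 200_000

u, trace = mu, np.empty(steps)
for t in range(steps):
    # discretized white noise with variance 2*C*lam_e
    xi = rng.normal(0.0, np.sqrt(2 * C * lam_e / dt))
    u += dt / C * (lam_e * (mu - u) + xi)  # relaxation toward mu plus noise
    trace[t] = u
print("time-averaged potential (~ MAP):", trace.mean())
print("variance of the potential:", trace.var())
```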
“…This is because the response variability of cortical neurons observed in electrophysiological recordings has been well explained in terms of probabilistic computation (Shadlen and Newsome, 1998). To date, stochastic computing algorithms based on restricted Boltzmann machines (Jordan et al., 2019) and Bayesian inference (Sountsov and Miller, 2015) have exhibited remarkable advantages in edge detection (Joe and Kim, 2019), traffic prediction (Sun X. et al., 2020), and the complex prediction of protein functions (Zou et al., 2017). However, the existing stochastic neural networks remain in quasi-stochastic states and are accelerated by the central processing unit or graphics processing unit.…”
Section: Introduction
confidence: 99%
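For readers unfamiliar with the restricted Boltzmann machine mentioned in the quote, a minimal block-Gibbs sampler looks as follows. Sizes and weights are illustrative; this is not the network of Jordan et al., 2019.

```python
import numpy as np

rng = np.random.default_rng(2)
n_vis, n_hid = 6, 4                          # illustrative layer sizes
W = rng.normal(0, 0.5, (n_vis, n_hid))       # visible-hidden couplings
b, c = np.zeros(n_vis), np.zeros(n_hid)      # biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

v = rng.integers(0, 2, n_vis).astype(float)  # random initial visible state
for _ in range(1000):                        # alternating block Gibbs sampling
    h = (rng.random(n_hid) < sigmoid(v @ W + c)).astype(float)
    v = (rng.random(n_vis) < sigmoid(W @ h + b)).astype(float)
print("final visible sample:", v)
```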
“…It is straightforward to show that the CNOT-gate cannot be realized on the operator level. A realization on the operator level would require relations of the type (39) on the level of the classical operators $B^{(\mu\nu)}$ and $A^{(\mu\nu)}$, which amounts to
$$[S, A^{(30)}] = [S, A^{(01)}] = [S, A^{(31)}] = 0\,,$$
$$S A^{(11)} = A^{(10)} S\,,\quad S A^{(10)} = A^{(11)} S\,,\quad S A^{(21)} = A^{(20)} S\,,\quad S A^{(20)} = A^{(21)} S\,,$$
$$S A^{(33)} = A^{(03)} S\,,\quad S A^{(03)} = A^{(33)} S\,,\quad S A^{(32)} = A^{(02)} S\,,\quad S A^{(02)} = A^{(32)} S\,,$$
$$S A^{(22)} = -A^{(13)} S\,,\quad S A^{(13)} = -A^{(22)} S\,,\quad S A^{(12)} = A^{(23)} S\,,\quad S A^{(23)} = A^{(12)} S\,.$$
…”
Section: CNOT-gate in Probabilistic Computing
confidence: 99%
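For context, the sketch below shows the truth table of the standard CNOT gate the quoted passage discusses. It only illustrates the gate's action on basis states; it does not reproduce the classical operator algebra ($A^{(\mu\nu)}$, $B^{(\mu\nu)}$, $S$) of the cited work.

```python
import numpy as np

# Permutation matrix of CNOT with the first bit as control:
# |00> -> |00>, |01> -> |01>, |10> -> |11>, |11> -> |10>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

labels = ["00", "01", "10", "11"]
for i, lab in enumerate(labels):
    basis = np.zeros(4)
    basis[i] = 1.0
    out = CNOT @ basis
    print(f"|{lab}> -> |{labels[int(out.argmax())]}>")
```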
“…Since neural networks operate classically, this amounts to realizing quantum statistics by classical statistical systems. Neuromorphic computing [11][12][13][14][15][16][17], or computational steps in our brain, may also realize quantum operations based on classical statistics. Learning by evolution, living systems may be able to perform quantum computational steps without realizing conditions that are often assumed to be necessary for quantum mechanics, such as the presence of small and well-isolated systems or low temperature.…”
Section: Introduction
confidence: 99%