2016
DOI: 10.1038/ncomms12611

Unsupervised learning in probabilistic neural networks with multi-state metal-oxide memristive synapses

Abstract: In an increasingly data-rich world the need for developing computing systems that can not only process, but ideally also interpret big data is becoming continuously more pressing. Brain-inspired concepts have shown great promise towards addressing this need. Here we demonstrate unsupervised learning in a probabilistic neural network that utilizes metal-oxide memristive devices as multi-state synapses. Our approach can be exploited for processing unlabelled data and can adapt to time-varying clusters that underl…

Cited by 303 publications (224 citation statements)
References 52 publications
“…Fig. 7 reveals the switching rate surface reproduced by the proposed model expressions (8) and the parameter values (9), (10) along with the switching rate measurements that resulted by the application of the optimizer routine on the DUT. The particularly good matching of the proposed expression (8) versus experimental results corroborates the assumptions upon which our model was structured.…”
Section: Switching Sensitivity Function
Mentioning confidence: 99%
“…What we propose is the straightforward integration of the state variable function (8) with the use of the built in VA time-domain integration operator "idt()". Numerical integration may produce errors in sensitive models with extremely long time constants but this operation is far more preferable than time-domain differentiation [8].…”
Section: B Verilog-a Coding Detailsmentioning
confidence: 99%
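The integration strategy described in this statement — accumulating the state variable by time-domain integration rather than differentiation — can be illustrated outside Verilog-A. Below is a minimal Python sketch using forward-Euler integration as an analogue of the `idt()` operator; the function `switching_rate` is a hypothetical stand-in, since the paper's actual expression (8) is not reproduced here:

```python
def switching_rate(v, x):
    """Hypothetical stand-in for a memristor switching-rate expression:
    rate grows with applied voltage and saturates as the state variable x
    approaches its bounds. NOT the model's actual expression (8)."""
    return v * abs(v) * (1.0 - x if v > 0 else x)

def integrate_state(v_of_t, dt, x0=0.5):
    """Forward-Euler analogue of Verilog-A's idt(): accumulate
    dx/dt = f(v, x) over a sampled voltage waveform."""
    x = x0
    trace = []
    for v in v_of_t:
        x += dt * switching_rate(v, x)
        x = min(max(x, 0.0), 1.0)  # keep the state variable bounded in [0, 1]
        trace.append(x)
    return trace

# A sustained positive programming bias drives x upward toward its upper bound.
trace = integrate_state([1.0] * 100, dt=0.01)
```

As the quoted statement notes, explicit numerical integration like this can accumulate error in stiff models with very long time constants, but it avoids the noise amplification inherent in time-domain differentiation.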
“…For each presentation of a pattern, the neuron with the largest output fires and claims the pattern, and only the 9 synaptic weights associated with this neuron are updated with a WTA rule [19]. Specifically, only when output neuron j has the largest output and fires (wins) are G OFFi,j (i = 1−9) updated: ΔG OFFi,j is increased by detrapping pulses if the input neuron i also fires or decreased by trapping pulses if the input neuron i does not fire.…”
Section: Theory and Simulation
Mentioning confidence: 99%
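The winner-take-all (WTA) update described above can be sketched in Python. The dimensions (9 inputs, 3 output neurons), the conductance step `DG`, and the initial conductance range are illustrative assumptions; the rule itself follows the quoted statement — only the winning neuron's synapses are updated, potentiated where the input fires and depressed where it does not:

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out = 9, 3                              # 3x3 binary patterns, 3 output neurons (assumed)
G = rng.uniform(0.2, 0.8, size=(n_in, n_out))   # synaptic conductances (normalized, illustrative)
DG = 0.05                                       # conductance change per trapping/detrapping pulse

def wta_update(G, x):
    """One pattern presentation: the neuron with the largest output wins,
    and only its 9 synaptic weights are updated (WTA rule)."""
    j = int(np.argmax(x @ G))                   # winning output neuron
    # Detrapping pulses (increase) where the input fires, trapping pulses
    # (decrease) where it does not.
    G[:, j] += np.where(x > 0, DG, -DG)
    np.clip(G[:, j], 0.0, 1.0, out=G[:, j])     # enforce conductance bounds
    return j

# Repeated presentation of one pattern drives the winner's weights toward it.
x = np.array([1, 1, 1, 0, 0, 0, 0, 0, 0], dtype=float)
for _ in range(50):
    winner = wta_update(G, x)
```

After enough presentations the winning neuron's weight vector converges to the presented pattern, which is what lets the network cluster unlabelled inputs without supervision.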