2022
DOI: 10.3390/electronics11101610
Nonvolatile Memories in Spiking Neural Network Architectures: Current and Emerging Trends

Abstract: A sustainable computing scenario demands more energy-efficient processors. Neuromorphic systems mimic biological functions by employing spiking neural networks to achieve brain-like efficiency, speed, adaptability, and intelligence. Current trends in neuromorphic technologies address the challenges of investigating novel materials, systems, and architectures for enabling high-integration and extremely low-power brain-inspired computing. This review collects the most recent trends in exploiting the physical pr…

Cited by 12 publications (8 citation statements)
References 190 publications
“…These results of SNN experiments are similar to ANN training results by [25]. These ANN experiments found that the suitable range for Caputo derivative orders is between 6/9 and 8/9.…”
Section: UCI Dataset Results (supporting)
confidence: 80%
“…As described in Section 2, Caputo derivative can be utilized to calculate the partial derivative of the Tempotron cost function with respect to synaptic weights. To acquire the equation by which each weight could be updated using this novel Caputo-based Tempotron (Caputron) optimizer, first the Caputo derivative of the cost function should be considered over a restricted interval of the weight values [c, w ji ], and with a derivative order between 0 and 1, similarly to Equation (8).…”
Section: Caputron Learning (mentioning)
confidence: 99%
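The Caputron update described in the excerpt above rests on evaluating a Caputo fractional derivative of the cost function over a restricted weight interval [c, w_ji], with derivative order between 0 and 1. As a hedged illustration only (the cited paper's exact discretization and its Equation (8) are not reproduced here), the standard L1 finite-difference scheme for a Caputo derivative of order 0 < α < 1 can be sketched as follows; the function name `caputo_l1` and the grid size `n` are this sketch's own choices, not the authors':

```python
import math

def caputo_l1(f, c, w, alpha, n=1000):
    """L1 finite-difference approximation of the Caputo fractional
    derivative of order alpha (0 < alpha < 1) of f over [c, w].

    This is a generic textbook scheme, offered as a sketch of the kind
    of computation a Caputo-based optimizer would perform per weight.
    """
    h = (w - c) / n
    coef = h ** (-alpha) / math.gamma(2.0 - alpha)
    total = 0.0
    for k in range(n):
        # L1 weights: b_j = (j+1)^(1-alpha) - j^(1-alpha), applied to
        # first differences of f on a uniform grid over [c, w].
        b = (n - k) ** (1.0 - alpha) - (n - k - 1) ** (1.0 - alpha)
        total += b * (f(c + (k + 1) * h) - f(c + k * h))
    return coef * total

# Sanity check against a known closed form: for f(t) = t**2 with c = 0,
# the Caputo derivative of order alpha is 2 * w**(2 - alpha) / Gamma(3 - alpha).
alpha = 0.5
approx = caputo_l1(lambda t: t * t, 0.0, 1.0, alpha)
exact = 2.0 / math.gamma(3.0 - alpha)
```

In a Caputron-style rule, such a value would replace the ordinary gradient in the weight update (schematically, w_ji ← w_ji − η · D^α C(w_ji)), with α chosen in (0, 1) as the excerpt states.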
“…The idea of integrating analog spikes or stimuli is an extension of the summation scheme, which is a vital aspect for integrate-and-fire neurons in a spiking neural network (SNN) (Figure 7c). [ 385,386 ] The idea of an artificial synapse that obtains spike signals from various presynaptic inputs has been demonstrated. For example, in PCM layers, input spikes are combined and computed through Kirchhoff's law of summing currents at a neuron circuitry source (Figure 7d).…”
Section: MM-based In-Memory Computing (mentioning)
confidence: 99%
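The summation scheme in the excerpt above — presynaptic currents adding at a node per Kirchhoff's current law, then integrated by the neuron — can be sketched in software with a minimal leaky integrate-and-fire model. This is a generic illustration, not the cited paper's circuit; the parameter values (`v_thresh`, `leak`) are arbitrary assumptions:

```python
def lif_run(spike_trains, weights, v_thresh=1.0, leak=0.95):
    """Minimal leaky integrate-and-fire neuron.

    spike_trains: list of per-input binary spike sequences (equal length),
    weights: one synaptic weight per input. At each time step the weighted
    input spikes are summed (the software analog of Kirchhoff current
    summation at the neuron node), leaked membrane potential is updated,
    and an output spike is emitted when the threshold is crossed.
    """
    v = 0.0
    out = []
    for t in range(len(spike_trains[0])):
        # Kirchhoff-style summation: all input currents add at one node.
        i_in = sum(w * s[t] for w, s in zip(weights, spike_trains))
        v = leak * v + i_in
        if v >= v_thresh:   # fire and reset
            out.append(1)
            v = 0.0
        else:
            out.append(0)
    return out

# Two inputs spiking together push the membrane over threshold at t = 0.
out = lif_run([[1, 0, 1], [1, 0, 0]], [0.6, 0.6])
```

In a PCM crossbar, the same weighted sum is realized physically: each device's conductance plays the role of a weight, and the column current is the summed input delivered to the neuron circuit.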
“…The implementation of parallelism has the potential to greatly enhance the efficiency of both neural network training and inference activities. [ 229 ] …”
Section: Electrochemical-Memristor-Based Artificial Neurons and Synapses (mentioning)
confidence: 99%