2023
DOI: 10.1002/adma.202300023

Preventing Vanishing Gradient Problem of Hardware Neuromorphic System by Implementing Imidazole‐Based Memristive ReLU Activation Neuron

Abstract: With advances in artificial intelligence services, brain‐inspired neuromorphic systems with synaptic devices have recently attracted significant interest as a way to circumvent the von Neumann bottleneck. However, the growing number of deep neural network parameters causes huge power consumption and a large area overhead in the nonlinear neuron electronic circuit, and it incurs a vanishing gradient problem. Here, a memristor‐based compact and energy‐efficient neuron device is presented to implement a rectifying linear unit…
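As background to the abstract (standard definitions, not material taken from the paper itself), the rectifying linear unit (ReLU) that the memristive neuron is designed to realize, together with its derivative, is

\[
\mathrm{ReLU}(x) = \max(0, x),
\qquad
\frac{d}{dx}\,\mathrm{ReLU}(x) =
\begin{cases}
1, & x > 0,\\
0, & x \le 0.
\end{cases}
\]

Because the derivative is exactly 1 for every positive input, gradients backpropagated through active ReLU neurons are not attenuated from layer to layer, which is the property the title refers to as preventing the vanishing gradient problem.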

Cited by 14 publications (3 citation statements)
References 44 publications
“…Moreover, as the number of hidden layers in the multilayer perceptron structure of neural networks increases, the neuron circuits that interconnect the pre- and post-layers become more complicated, which degrades the area and power efficiencies of computing systems based on neural networks [21–24]. The activation function of neuronal circuits is simplified or merged into a memory array to alleviate degradation [25–27].…”
Section: Introduction
confidence: 99%
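To make the quoted point concrete, below is a minimal software-level sketch (an illustration under assumed toy dimensions, not the circuit or device reported in the paper) of a multilayer perceptron in which each weight matrix stands in for one memory (crossbar) array performing the matrix-vector multiply, and the ReLU stands in for the neuron circuit that links a pre-layer to a post-layer; merging this activation into the array periphery is what removes the separate nonlinear neuron circuit per layer.

```python
import numpy as np

def relu(v):
    # Rectified linear unit: passes positive values, clips negatives to zero.
    return np.maximum(0.0, v)

def mlp_forward(x, weight_layers):
    """Forward pass of a simple multilayer perceptron.

    Each weight matrix plays the role of one crossbar array
    (matrix-vector multiply); the ReLU models the activation neuron
    placed between the pre- and post-layers.
    """
    a = x
    for W in weight_layers:
        a = relu(W @ a)  # array output -> activation neuron -> next layer
    return a

# Hypothetical toy network: three 8x8 weight layers with random values.
rng = np.random.default_rng(0)
layers = [rng.normal(scale=0.5, size=(8, 8)) for _ in range(3)]
print(mlp_forward(rng.normal(size=8), layers))
```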
“…Meanwhile, achieving high accuracy in multifunctional DNNs using the reported devices and hardware network structures remains highly challenging. The main reason is that the majority of networks rely on backpropagation (BP) for weight updates [22–25], yet the layer-by-layer structure leads to gradient vanishing (exploding), making it difficult to effectively train the network [26–29]. Although the most advanced neuromorphic chips have attempted to construct DNNs with cross-layer transmission, they still rely on repeated reading of digital memory and DAC/ADC to achieve parallel output of results [30–32].…”
Section: Introduction
confidence: 99%
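The gradient vanishing that this statement describes can be illustrated numerically: in backpropagation the gradient reaching the earliest layer contains one activation-derivative factor per layer, so a saturating activation such as the sigmoid (whose derivative never exceeds 0.25) shrinks it exponentially with depth, whereas a ReLU-style neuron contributes a factor of exactly 1 while it is active. A minimal sketch, with an arbitrarily chosen pre-activation value and the weight terms ignored:

```python
import math

def sigmoid_grad(z):
    # Derivative of the logistic sigmoid; never exceeds 0.25.
    s = 1.0 / (1.0 + math.exp(-z))
    return s * (1.0 - s)

def relu_grad(z):
    # Derivative of ReLU: exactly 1 for active (positive) inputs.
    return 1.0 if z > 0 else 0.0

# In backpropagation, the gradient reaching the first layer is (ignoring
# weight terms) a product of one activation derivative per layer.
z = 0.5  # representative pre-activation value
for n_layers in (5, 20, 50):
    sig_chain = sigmoid_grad(z) ** n_layers
    relu_chain = relu_grad(z) ** n_layers
    print(f"{n_layers:2d} layers: sigmoid chain {sig_chain:.2e}, ReLU chain {relu_chain:.1f}")
```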
“…Most existing neuromorphic devices cannot be reconfigured to fulfill diverse run-time requirements and hence depend on tailored designs specific to targeted applications. [26,27] For example, neurons for specific activation functions, [28,29] artificial dendrites, [30] and physical reservoir computing, [31] all of which play a vital role in neuromorphic computing, have thus far been difficult to reconfigure. Furthermore, energy- and area-efficient neuromorphic hardware imposes stringent requirements for the integration of multiple sophisticated brain-like functions in an all-in-one manner.…”
Section: Introduction
confidence: 99%