In this paper, we review recent trends in neuromorphic computing using emerging memory technologies. Two representative learning algorithms for implementing a hardware-based neural network are described: a bio-inspired learning algorithm and a software-based learning algorithm, in particular back-propagation. The requirements imposed on synaptic devices by each algorithm are analyzed, and research trends in synaptic devices for implementing artificial neural networks are then reviewed.
Hardware-based spiking neural networks (SNNs) inspired by the biological nervous system are regarded as an innovative computing paradigm with very low power consumption and massively parallel operation. To train SNNs with supervision, we propose an efficient on-chip training scheme that approximates the backpropagation algorithm in a form suitable for hardware implementation. By exploiting the stochastic characteristics of neurons, we show that the accuracy of the proposed scheme for SNNs is close to that of conventional artificial neural networks (ANNs). In the hardware configuration, gated Schottky diodes (GSDs) are used as synaptic devices; their current saturates with respect to the input voltage. We design the SNN system around the proposed on-chip training scheme with GSDs, whose conductances can be updated in parallel to speed up the overall system. The performance of the on-chip training SNN system is validated through MNIST classification for various network sizes and total time steps. The SNN systems achieve accuracies of 97.83% with one hidden layer and 98.44% with four hidden layers in fully connected neural networks. We then evaluate the effect of the non-linearity and asymmetry of the conductance response during long-term potentiation (LTP) and long-term depression (LTD) on the performance of the on-chip training SNN system. In addition, the impact of device variations on system performance is evaluated.
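The non-linearity and asymmetry of the LTP/LTD conductance response can be illustrated with a common behavioral model for analog synaptic devices, in which the step size shrinks as the conductance approaches the rail it is moving toward. This is an illustrative sketch, not the measured GSD characteristics; the parameter names (`num_levels`, `nonlinearity`) are assumptions.

```python
import math

def conductance_update(g, pulse_is_ltp, g_min=0.0, g_max=1.0,
                       num_levels=100, nonlinearity=2.0):
    """Apply one LTP or LTD pulse using a generic nonlinear device model.

    The conductance step decays exponentially as g approaches the
    target rail, modeling the saturating (nonlinear) LTP/LTD curves
    of many analog synaptic devices. Using different effective step
    sizes for LTP and LTD would model asymmetry as well.
    """
    g_norm = (g - g_min) / (g_max - g_min)  # normalize to [0, 1]
    step = (g_max - g_min) / num_levels     # nominal per-pulse step
    if pulse_is_ltp:
        dg = step * math.exp(-nonlinearity * g_norm)          # small steps near g_max
        return min(g + dg, g_max)
    else:
        dg = step * math.exp(-nonlinearity * (1.0 - g_norm))  # small steps near g_min
        return max(g - dg, g_min)
```

Sweeping `nonlinearity` toward 0 recovers an ideal linear response; large values reproduce the strongly saturating curves that degrade on-chip training accuracy.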
A positive-feedback (PF) neuron device capable of threshold tuning and of simultaneously processing excitatory (G+) and inhibitory (G-) signals is experimentally demonstrated, for the first time, as a replacement for conventional neuron circuits. Thanks to the PF operation, the PF neuron device with steep switching characteristics can implement the integrate-and-fire (IF) function of neurons with low energy consumption. The structure of the PF neuron device efficiently merges a gated PNPN diode and a single MOSFET. IF operation with a steep subthreshold swing (SS < 1 mV/dec) is experimentally implemented by carriers accumulated in the n floating body of the PF neuron device. The accumulated carriers are discharged by an inhibitory signal applied to the merged FET. Moreover, the threshold voltage (Vth) of the proposed PF neuron is controlled by using a charge storage layer. The low-energy PF neuron circuit (~0.62 pJ/spike) consists of one PF device and only five MOSFETs for the IF and reset operations. In a high-level system simulation, a deep spiking neural network (D-SNN) based on PF neurons with four hidden layers (1024 neurons per layer) achieves high accuracy (98.55%) on an MNIST classification task. The PF neuron device provides a viable solution for high-density, low-energy neuromorphic systems.
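The IF behavior that the PF device implements can be sketched in idealized software form: inputs are integrated onto a state variable (the role played by carrier accumulation in the floating body), and crossing a threshold produces a spike followed by a reset (the role of the discharge/reset circuitry). This is a minimal illustrative model, not the device physics; the hard reset and the `v_th` parameter are simplifying assumptions.

```python
def integrate_and_fire(input_currents, v_th=1.0, dt=1.0):
    """Ideal IF neuron: accumulate input and fire/reset at threshold.

    input_currents: sequence of input values, one per time step.
    Returns a binary spike train of the same length.
    """
    v = 0.0
    spikes = []
    for i in input_currents:
        v += i * dt          # integrate (carrier accumulation)
        if v >= v_th:
            spikes.append(1)
            v = 0.0          # fire and reset (body discharge)
        else:
            spikes.append(0)
    return spikes
```

For a constant input of 0.4 per step with `v_th=1.0`, the membrane value reaches 1.2 on the third step, so the neuron fires once every three steps.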