In this paper, we present a survey of recent work on neuromorphic, or neuro-inspired, hardware systems. In particular, we focus on systems that can learn from data in an unsupervised or online supervised manner. We present algorithms and architectures developed specifically to support on-chip learning. Emphasis is placed on hardware-friendly modifications of standard algorithms, such as backpropagation, as well as novel algorithms, such as structural plasticity, developed specifically for low-resolution synapses. We cover work on both spike-based and more traditional non-spike-based algorithms. This is followed by developments in novel devices, such as floating-gate MOS transistors, memristors, and spintronic devices. CMOS circuit innovations for on-chip learning and CMOS interface circuits for post-CMOS devices, such as memristors, are presented. Common architectures, such as crossbar and island-style arrays, are discussed along with their relative merits and drawbacks. Finally, we present possible applications of neuromorphic hardware, such as brain-machine interfaces and robotics, and identify future research trends in the field.
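To make the crossbar architecture mentioned above concrete, the sketch below shows the analog matrix-vector product such an array performs: input voltages drive the rows, each cross-point conductance contributes a current proportional to its voltage, and column currents sum by Kirchhoff's current law. The array size, conductance range, and voltages are hypothetical, chosen only for illustration; none are taken from the survey.

```python
import numpy as np

# Hypothetical 4x3 crossbar: G[i, j] is the conductance (in siemens) of
# the device at row i, column j. Values are illustrative only.
rng = np.random.default_rng(0)
G = rng.uniform(1e-6, 1e-4, size=(4, 3))

# Row input voltages (volts). Each column current is
# i_out[j] = sum_i v[i] * G[i, j], i.e. a matrix-vector product
# computed "for free" by Ohm's law and Kirchhoff's current law.
v = np.array([0.1, 0.2, 0.0, 0.3])
i_out = v @ G

print(i_out.shape)  # one summed current per column
```

In a real array, negative weights are typically encoded with paired columns (differential conductances), a detail omitted here for brevity.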
Shallow feed-forward networks are incapable of addressing complex tasks, such as natural language processing, that require learning from temporal signals. To meet these requirements, we need deep neuromorphic architectures with recurrent connections, such as deep recurrent neural networks. However, training such networks demands very high weight precision, excellent conductance linearity, and low write noise, requirements not satisfied by current memristive implementations. Inspired by optogenetics, here we report a neuromorphic computing platform comprising photo-excitable neuristors capable of in-memory computation across 980 addressable states with a high signal-to-noise ratio of 77. The large linear dynamic range, low write noise, and selective excitability allow high-fidelity opto-electronic transfer of weights with a two-shot write scheme, while electrical in-memory inference provides energy efficiency. This method enables implementation of a memristive deep recurrent neural network with twelve trainable layers and more than a million parameters that recognizes spoken commands with >90% accuracy.
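Transferring trained weights onto a device with a finite set of addressable conductance states amounts to quantization. The sketch below maps floating-point weights to the nearest of 980 linearly spaced levels, matching the state count reported in the abstract; the weight range and the example values are hypothetical, and the actual two-shot write scheme and noise model of the platform are not reproduced here.

```python
import numpy as np

N_STATES = 980  # addressable states reported in the abstract

def quantize(w, w_min=-1.0, w_max=1.0, n_states=N_STATES):
    """Map each weight to the nearest of n_states linearly spaced levels.

    The [w_min, w_max] range is an assumption for illustration; a real
    transfer would calibrate it to the device's conductance window.
    """
    levels = np.linspace(w_min, w_max, n_states)
    idx = np.abs(w[..., None] - levels).argmin(axis=-1)
    return levels[idx]

w = np.array([-0.731, 0.0042, 0.5, 0.999])
wq = quantize(w)

# Worst-case error is half the level spacing: (w_max - w_min) / (n_states - 1) / 2
print(np.max(np.abs(w - wq)))
```

With 980 levels over a unit-normalized range, the step size is about 0.002, so the quantization error per weight is bounded by roughly 0.001, which is one reason a large linear dynamic range matters for deep-network transfer.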