2018
DOI: 10.1109/jetcas.2018.2816339
Low-Power, Adaptive Neuromorphic Systems: Recent Progress and Future Directions

Abstract: In this paper, we present a survey of recent works in developing neuromorphic or neuro-inspired hardware systems. In particular, we focus on those systems which can learn from data in either an unsupervised or online supervised manner. We present algorithms and architectures developed specifically to support on-chip learning. Emphasis is placed on hardware-friendly modifications of standard algorithms, such as backpropagation, as well as novel algorithms, such as structural plasticity, developed specially for low-…

Cited by 97 publications (57 citation statements)
References 177 publications (244 reference statements)
“…Several models describe the neuron functions, with a varying degree of complexity [52,53,75]. A large amount of them have already been transposed onto hardware [11,20,31,73,198]. The Leaky Integrate and Fire (LIF) model has encountered a special interest among hardware designers [1,15,38,122], we thus discuss its working principle below.…”
Section: Computation in the Brain
confidence: 99%
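The working principle of the Leaky Integrate-and-Fire model referenced above can be sketched in a few lines. This is a minimal discrete-time illustration, not the implementation of any particular chip surveyed here; the parameter names (`tau`, `v_thresh`, etc.) and values are illustrative.

```python
def lif_neuron(input_current, dt=1.0, tau=20.0, v_rest=0.0,
               v_thresh=1.0, v_reset=0.0):
    """Simulate a leaky integrate-and-fire neuron over a list of
    input-current samples.

    The membrane potential follows dv/dt = (v_rest - v + I) / tau;
    when v crosses v_thresh the neuron emits a spike and v resets.
    """
    v = v_rest
    spikes = []
    for i_t in input_current:
        # Euler step: leak toward v_rest while integrating the input
        v += dt * (v_rest - v + i_t) / tau
        if v >= v_thresh:
            spikes.append(1)   # spike emitted
            v = v_reset        # membrane potential resets
        else:
            spikes.append(0)
    return spikes

# A constant suprathreshold input drives regular spiking
out = lif_neuron([1.5] * 300)
```

The leak term is what distinguishes the LIF model from a plain integrator: with subthreshold input, the potential decays toward rest instead of accumulating indefinitely, which is part of why the model maps so cheaply onto analog and digital hardware.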
“…Over the past decade, GPUs have emerged as a major hardware resource for deep learning tasks. However, fields, such as the internet of things (IoT) and edge computing are constantly in need of more efficient neural-network-specific hardware (Basu et al, 2018;Deng et al, 2018;Alyamkin et al, 2019;Roy et al, 2019). This encourages competition among companies, such as Intel, IBM, and others to propose new hardware alternatives, leading to the emergence of commercially available deep learning accelerators (Barry et al, 2015;Jouppi et al, 2017) and neuromorphic chips (Esser et al, 2016;Davies et al, 2018;Pei et al, 2019).…”
Section: Introduction
confidence: 99%
“…The latter is a famous method used to modify the strength of synapses depending on the relative times of spiking of the involved neurons [18]. The memory element is a floating gate that stores quasi-permanently a charge and it is one of the main candidates for neuromorphic circuits [19]- [21] thanks to the full compatibility with the current CMOS technology. Differently from previously reported floating gate synapses [14], [20]- [22], the stored charge is modified through Fowler-Nordheim tunneling effect [23] avoiding the large current required by hot-carrier injection [24].…”
Section: Introduction
confidence: 99%
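The STDP rule referenced in the statement above modifies a synaptic weight based on the relative timing of pre- and post-synaptic spikes. The following is a minimal pair-based sketch of that idea, not the circuit-level floating-gate mechanism the cited work describes; the function name and parameters (`a_plus`, `tau_plus`, etc.) are illustrative.

```python
import math

def stdp_update(w, t_pre, t_post, a_plus=0.01, a_minus=0.012,
                tau_plus=20.0, tau_minus=20.0, w_min=0.0, w_max=1.0):
    """Pair-based STDP weight update.

    If the presynaptic spike precedes the postsynaptic spike
    (dt > 0), the synapse is potentiated; if it follows (dt < 0),
    the synapse is depressed. The magnitude decays exponentially
    with the spike-time difference.
    """
    dt = t_post - t_pre
    if dt > 0:    # causal pairing: pre before post -> potentiation
        w += a_plus * math.exp(-dt / tau_plus)
    elif dt < 0:  # anti-causal pairing: post before pre -> depression
        w -= a_minus * math.exp(dt / tau_minus)
    return min(max(w, w_min), w_max)  # clip to allowed weight range

w_pot = stdp_update(0.5, t_pre=10.0, t_post=15.0)  # pre leads: w grows
w_dep = stdp_update(0.5, t_pre=15.0, t_post=10.0)  # post leads: w shrinks
```

In a floating-gate implementation of this rule, the clipped weight `w` would correspond to the stored gate charge, updated here via Fowler-Nordheim tunneling rather than hot-carrier injection.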