2020 IEEE International Symposium on Circuits and Systems (ISCAS)
DOI: 10.1109/iscas45731.2020.9180440
A 28-nm Convolutional Neuromorphic Processor Enabling Online Learning with Spike-Based Retinas

Cited by 31 publications (20 citation statements)
References 25 publications
“…This furthermore highlights that edge computing is an ideal use case for biologically-motivated algorithms, as an out-of-the-box application of feedback-alignment- and target-propagation-based algorithms currently does not scale to complex datasets (see Bartunov et al., 2018 for a recent review). We demonstrate this claim in Frenkel et al. (2020) with the design of an event-driven convolutional processor that requires only 16.8% power and 11.8% silicon area overheads for on-chip online learning, a record-low overhead that is specifically enabled by DRTP, thus highlighting its low cost for edge computing devices. Finally, as DRTP can also be formulated as a three-factor learning rule for biologically-plausible learning, it is suitable for embedded neuromorphic computing, in which high-density synaptic plasticity can currently not be achieved without compromising learning performance (Frenkel et al., 2019b, c).…”
Section: Introduction (supporting)
confidence: 56%
“…Therefore, as opposed to increasing the resources of shallow-trained networks, DRTP offers a low-overhead training algorithm operating on small network topologies, ideally suiting edge-computing hardware requirements. These claims are proven in silico in Frenkel et al. (2020), where implementing DRTP in an event-driven convolutional processor requires only 16.8% power and 11.8% silicon area overheads, and allows demonstrating a favorable accuracy-power-area tradeoff compared to both on-chip online- and off-chip offline-trained conventional machine learning accelerators on the MNIST dataset.…”
Section: Discussion (mentioning)
confidence: 94%
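The DRTP rule credited above with the low on-chip learning overhead can be illustrated with a minimal sketch: Direct Random Target Projection trains hidden layers with a fixed random projection of the one-hot target instead of a backpropagated error, removing the need for symmetric weight transport. The network sizes, learning rate, data, and sign/scale conventions below are illustrative assumptions, not values from the paper; only the update structure follows DRTP.

```python
# Minimal sketch of Direct Random Target Projection (DRTP) on a tiny
# two-layer network; all dimensions and hyperparameters are assumed.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hid, n_out = 8, 16, 4
W1 = rng.normal(0, 0.1, (n_hid, n_in))   # input -> hidden weights
W2 = rng.normal(0, 0.1, (n_out, n_hid))  # hidden -> output weights
# Fixed random matrix projecting the one-hot target onto the hidden
# layer: it stands in for the backpropagated error, so the hidden
# update is available as soon as the label arrives.
B1 = rng.normal(0, 0.1, (n_hid, n_out))

def drtp_step(x, target, lr=0.01):
    global W1, W2
    h = np.maximum(W1 @ x, 0.0)          # hidden ReLU activations
    y = W2 @ h                           # linear output (logits)
    e = y - target                       # output error
    W2 -= lr * np.outer(e, h)            # output layer: true gradient
    # Hidden layer: modulate by the random target projection, gated by
    # the ReLU derivative (sign/scale absorbed into the random B1).
    delta_h = (B1 @ target) * (h > 0)
    W1 -= lr * np.outer(delta_h, x)
    return 0.5 * np.sum(e ** 2)

x = rng.normal(size=n_in)
t = np.eye(n_out)[2]                     # one-hot target, class 2
initial_loss = drtp_step(x, t)
for _ in range(200):
    final_loss = drtp_step(x, t)
```

Because neither the forward pass nor the hidden update waits on a backward error pass, the rule maps naturally onto the event-driven, online-learning hardware described in the citation statements.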
“…Most current neuromorphic systems are fully digital and typically allow one to simulate software-trained models without performance loss ( 38 , 39 ). This approach is flexible with regard to the SNN training schemes used ( 23 , 40 – 49 ), but to fully leverage recent advances in material sciences often requires dealing with analog or mixed-signal components ( 15 , 19 , 50 , 51 ).…”
Section: Discussion (mentioning)
confidence: 99%
“…More efficient memory access is realized on an FPGA prototype in [27] by using a novel memory arbiter. A 28 nm SCNN processor is fabricated in [28], comprising one CONV layer of 10 kernels for feature extraction, one pooling layer for dimension reduction, and two FC layers for feature classification. A systolic SCNN inference engine is proposed in [29], instantiating two CONV layers with one FC layer.…”
Section: Introduction (mentioning)
confidence: 99%
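The CONV-pool-FC-FC topology attributed to [28] above can be sketched as a plain feed-forward shape walk-through. The 28x28 input frame, 5x5 kernels, and FC widths are assumptions chosen for illustration; the citation statement only fixes the layer count and the 10 kernels, not the exact dimensions.

```python
# Shape walk-through of a one-CONV (10 kernels), one-pool, two-FC
# network, as described for the SCNN processor in [28]. Sizes other
# than the kernel count are assumed.
import numpy as np

rng = np.random.default_rng(1)

def conv2d_valid(img, kernels):
    """Naive valid-mode 2-D convolution (cross-correlation form):
    img is (H, W), kernels is (K, kH, kW)."""
    K, kH, kW = kernels.shape
    H, W = img.shape
    out = np.empty((K, H - kH + 1, W - kW + 1))
    for k in range(K):
        for i in range(H - kH + 1):
            for j in range(W - kW + 1):
                out[k, i, j] = np.sum(img[i:i+kH, j:j+kW] * kernels[k])
    return out

def maxpool2(x):
    """2x2 max pooling over (K, H, W) feature maps."""
    K, H, W = x.shape
    return x.reshape(K, H // 2, 2, W // 2, 2).max(axis=(2, 4))

frame = rng.random((28, 28))              # accumulated spike frame (assumed size)
kernels = rng.normal(0, 0.1, (10, 5, 5))  # 10 kernels, as stated for [28]
fc1 = rng.normal(0, 0.1, (128, 10 * 12 * 12))
fc2 = rng.normal(0, 0.1, (10, 128))

feat = np.maximum(conv2d_valid(frame, kernels), 0)  # CONV: 10 x 24 x 24
pooled = maxpool2(feat)                             # POOL: 10 x 12 x 12
hidden = np.maximum(fc1 @ pooled.ravel(), 0)        # FC1: 128 features
logits = fc2 @ hidden                               # FC2: 10 classes
```

In an event-driven implementation the dense loops above would be replaced by per-spike kernel updates, but the layer shapes and the feature-extraction/dimension-reduction/classification split are the same.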