2020 2nd IEEE International Conference on Artificial Intelligence Circuits and Systems (AICAS)
DOI: 10.1109/aicas48895.2020.9073948

On-chip Few-shot Learning with Surrogate Gradient Descent on a Neuromorphic Processor

Abstract: Recent work suggests that synaptic plasticity dynamics in biological models of neurons and neuromorphic hardware are compatible with gradient-based learning [1]. Gradient-based learning requires iterating several times over a dataset, which is both time-consuming and constrains the training samples to be independently and identically distributed. This is incompatible with learning systems that do not have boundaries between training and inference, such as in neuromorphic hardware. One approach to overcome these…
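The abstract refers to surrogate gradient learning for spiking neurons. As a rough illustration only, and not the paper's on-chip implementation, the sketch below shows how a non-differentiable spike function can be given a surrogate derivative in PyTorch; the fast-sigmoid surrogate, the leaky integrate-and-fire update, and all parameter values are assumptions made for illustration.

```python
import torch

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass, fast-sigmoid surrogate in the backward pass."""
    scale = 10.0  # assumed steepness of the surrogate derivative

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0).float()          # binary spike output

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # Replace the true (zero almost everywhere) derivative with 1 / (scale*|v| + 1)^2
        surrogate = 1.0 / (SurrogateSpike.scale * v.abs() + 1.0) ** 2
        return grad_output * surrogate

spike = SurrogateSpike.apply

def lif_step(x, mem, w, beta=0.9, threshold=1.0):
    """One step of a leaky integrate-and-fire layer with surrogate-gradient spiking."""
    mem = beta * mem + x @ w            # leaky membrane integration
    s = spike(mem - threshold)          # non-differentiable forward, surrogate backward
    mem = mem - s * threshold           # soft reset after a spike
    return s, mem
```

Because the surrogate only affects the backward pass, the forward dynamics stay event-driven, which is what makes this style of learning attractive for neuromorphic hardware.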

Cited by 21 publications (18 citation statements).
References 18 publications (30 reference statements).
“…Much work remains to be done on the spiking algorithm front, but steady progress is being made. Early efforts demonstrate digit recognition, fusion of visual and tactile perception [40], fusion of visual and EMG information [39], persistent attention and tracking [68], and online learning of gestures [102] using event-based sensors interfaced to Loihi.…”
Section: A. Event-based Sensing and Perception
confidence: 99%
“…Furthermore, neuromorphic-equipped medical devices will better protect users' sensitive medical data without cloud communication requirements. Moreover, implementing novel meta-learning algorithms, such as few-shot learning, on neuromorphic platforms will enable rapid adaptation and real-time learning in these systems from a few data points with minimal computation [39,49]. Examples of such applications, where online learning and adaptation of an ML model is crucial, include autonomous driving, surgical robotics, personalized medicine, and precision diagnostics [39,49-52].…”
Section: Discussion
confidence: 99%
“…Since we present each pattern one after another, the presentation time for N-MNIST is 105 ms, equivalent to the time taken for one saccade in the N-MNIST dataset. For DvsGesture, we take only the first 1,450 ms of each pattern to classify the dataset, as has been done earlier (e.g., Stewart et al., 2020), and this is the presentation time. As presynaptic spike rates vary throughout pattern presentation, the output neuron must spike only at the end of the presentation (see Iyer and Basu, 2017 for more details).…”
Section: Spiking Neural Network
confidence: 99%
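The excerpt above describes using only the first 1,450 ms of each DvsGesture recording as the presentation window. In practice this amounts to filtering events by timestamp; a minimal sketch follows, assuming a NumPy structured array with x, y, t (microseconds), and p fields, which is a common but here hypothetical event layout rather than the cited paper's code.

```python
import numpy as np

def first_window(events: np.ndarray, window_us: int = 1_450_000) -> np.ndarray:
    """Keep only events within the first `window_us` microseconds of a recording."""
    t0 = events["t"].min()                        # recording start time
    return events[events["t"] - t0 < window_us]   # events inside the presentation window

# Hypothetical usage with an (x, y, t, p) structured event array
dtype = [("x", "u2"), ("y", "u2"), ("t", "u8"), ("p", "u1")]
events = np.zeros(10, dtype=dtype)
clipped = first_window(events)
```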