2024
DOI: 10.1609/aaai.v38i10.29020

Exploiting Symmetric Temporally Sparse BPTT for Efficient RNN Training

Xi Chen, Chang Gao, Zuowen Wang, et al.

Abstract: Recurrent Neural Networks (RNNs) are useful in temporal sequence tasks. However, training RNNs involves dense matrix multiplications which require hardware that can support a large number of arithmetic operations and memory accesses. Implementing online training of RNNs on the edge calls for optimized algorithms for efficient deployment on hardware. Inspired by the spiking neuron model, the Delta RNN exploits temporal sparsity during inference by skipping over the update of hidden states from those inactivated neurons…
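
The delta-update rule the abstract describes can be sketched in a few lines. Below is a minimal NumPy illustration of the inference-time idea, not the paper's implementation: the function name `delta_rnn_step`, the threshold parameter `theta`, and the tanh nonlinearity are illustrative assumptions. Each neuron keeps a reference copy of its last significant value, and only changes exceeding the threshold trigger a column update of the weight matrices.

```python
import numpy as np

def init_delta_state(n_in, n_hid, b):
    """References start at zero; the pre-activation accumulator starts
    at the bias, so theta=0 reduces exactly to a dense vanilla RNN."""
    return (np.zeros(n_in),      # x_ref: last input value that "fired"
            np.zeros(n_hid),     # h:     current hidden state
            np.zeros(n_hid),     # h_ref: last hidden value that "fired"
            b.copy())            # M:     accumulated pre-activation

def delta_rnn_step(x, state, Wx, Wh, theta=0.1):
    x_ref, h, h_ref, M = state

    # Input delta, zeroed where the change since the last firing is small.
    dx = x - x_ref
    fire_x = np.abs(dx) >= theta
    dx = np.where(fire_x, dx, 0.0)
    x_ref = np.where(fire_x, x, x_ref)   # refresh reference only where fired

    # Hidden-state delta, gated the same way.
    dh = h - h_ref
    fire_h = np.abs(dh) >= theta
    dh = np.where(fire_h, dh, 0.0)
    h_ref = np.where(fire_h, h, h_ref)

    # Dedicated hardware skips the weight columns of silent neurons
    # entirely; this dense matmul just multiplies by their zeros.
    M = M + Wx @ dx + Wh @ dh
    h = np.tanh(M)
    return h, (x_ref, h, h_ref, M)

# Usage: run a random sequence through one delta-gated layer.
rng = np.random.default_rng(0)
n_in, n_hid = 8, 16
Wx = rng.normal(size=(n_hid, n_in))
Wh = rng.normal(size=(n_hid, n_hid))
state = init_delta_state(n_in, n_hid, b=np.zeros(n_hid))
for t in range(20):
    h, state = delta_rnn_step(rng.normal(size=n_in), state, Wx, Wh)
```

With `theta = 0` the accumulator telescopes back to the dense recurrence h_t = tanh(Wx·x_t + Wh·h_{t-1} + b), so the threshold trades accuracy for skipped operations. Per the title, the paper's contribution is applying this same sparsity pattern symmetrically in the backward pass of BPTT, so that training, not just inference, skips work for the silent neurons.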

Cited by: 0 publications
References: 18 publications (24 reference statements)