2023
DOI: 10.1126/sciadv.adi1480

SpikingJelly: An open-source machine learning infrastructure platform for spike-based intelligence

Wei Fang,
Yanqi Chen,
Jianhao Ding
et al.

Abstract: Spiking neural networks (SNNs) aim to realize brain-inspired intelligence on neuromorphic chips with high energy efficiency by introducing neural dynamics and spike properties. As the emerging spiking deep learning paradigm attracts increasing interest, traditional programming frameworks cannot meet the demands of automatic differentiation, parallel computation acceleration, and high integration of processing neuromorphic datasets and deployment. In this work, we present the SpikingJelly framework to addre…

Cited by 47 publications (19 citation statements)
References 155 publications
“…where Ω is the number of time steps (we set it to Ω = 16 based on [11]), P is the polarity, H is the height, and W is the width.…”
Section: Methods
confidence: 99%
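The excerpt above describes an event tensor of shape [Ω, P, H, W]. As a hedged illustration (not the cited paper's actual pipeline), the sketch below bins a synthetic DVS-style event stream into Ω = 16 frames with that layout; the event counts and sensor size are made-up values for demonstration.

```python
import numpy as np

# Hypothetical sketch: integrate an event stream into frames of shape
# [Omega, P, H, W], matching the tensor layout quoted above.
# Omega: number of time steps, P: polarity channels, H/W: sensor size.
OMEGA, P, H, W = 16, 2, 128, 128

rng = np.random.default_rng(0)
n_events = 10_000
# Synthetic events (timestamp, x, y, polarity); real data would come from a DVS camera.
t = np.sort(rng.uniform(0.0, 1.0, n_events))
x = rng.integers(0, W, n_events)
y = rng.integers(0, H, n_events)
p = rng.integers(0, P, n_events)

frames = np.zeros((OMEGA, P, H, W), dtype=np.float32)
# Assign each event to one of OMEGA uniform time slices and accumulate counts.
step = np.minimum((t / t[-1] * OMEGA).astype(int), OMEGA - 1)
np.add.at(frames, (step, p, y, x), 1.0)

print(frames.shape)       # (16, 2, 128, 128)
print(int(frames.sum()))  # 10000 -- every event lands in exactly one bin
```

`np.add.at` is used instead of `frames[step, p, y, x] += 1` because it accumulates correctly when multiple events hit the same (step, polarity, pixel) cell.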
“…A variety of gradient-based SNN libraries have been open-sourced, most of which are written in Python for syntactical ease, and several of which are built on top of commonplace deep learning packages [25, 53–57]. Most approaches compose primitive functions together wrapped as a spiking neuron node, where gradients are analytically calculated using reverse autodifferentiation in the backend.…”
Section: snnTorch
confidence: 99%
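The surrogate-gradient pattern the excerpt describes — a non-differentiable spike in the forward pass, a smooth stand-in derivative in the backward pass — can be sketched in plain NumPy. This is a minimal illustration, not SpikingJelly's or snnTorch's actual implementation; the sigmoid surrogate and the sharpness value ALPHA are assumptions for the example.

```python
import numpy as np

ALPHA = 4.0  # surrogate sharpness; hypothetical value for this sketch

def spike_forward(v, threshold=1.0):
    """Non-differentiable forward pass: fire where membrane potential crosses threshold."""
    return (v >= threshold).astype(np.float32)

def spike_backward(v, threshold=1.0):
    """Surrogate derivative used in the backward pass: d/dv of sigmoid(ALPHA*(v - threshold))."""
    s = 1.0 / (1.0 + np.exp(-ALPHA * (v - threshold)))
    return ALPHA * s * (1.0 - s)

v = np.array([0.2, 0.9, 1.0, 1.5], dtype=np.float32)
print(spike_forward(v))  # [0. 0. 1. 1.]
grads = spike_backward(v)
# The surrogate gradient peaks at the threshold, so near-threshold neurons
# receive the most credit even though the true derivative is zero almost everywhere.
print(int(grads.argmax()))  # 2
```

In an autodiff framework this pair would be registered as a custom autograd function wrapped inside the spiking neuron node, exactly the composition the excerpt mentions.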
“…Several alternative options are available for accelerating SNNs using CUDA-based libraries. SpikingJelly provides a CuPy backend [55], GeNN uses CUDA-generated code to implement an approximate form of BPTT [45,58], and lava-dl incorporates the most commonly used functions/neurons as optimized CUDA code, while other libraries mostly depend on the deep learning package's CUDA acceleration.…”
Section: snnTorch
confidence: 99%
“…The higher the spiking frequency, the higher the probability that the input sample belongs to the corresponding category of the spiking neuron. We used the deep learning framework PyTorch and the spiking neural network framework SpikingJelly [38] to establish the RCSNN-12 model. After conducting many preliminary experiments, we optimized the parameters and found a set of parameters that performed well.…”
Section: Sensor Type
confidence: 99%
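The rate-coding readout the excerpt describes — the output neuron that fires most often over the simulation window gives the predicted class — can be shown with a tiny deterministic example. The spike trains below are toy values, not output from the cited RCSNN-12 model.

```python
import numpy as np

# Toy output spike trains over T=8 time steps for 3 output neurons (one column each).
out_spikes = np.array([
    [0, 1, 0],
    [1, 1, 0],
    [0, 1, 1],
    [0, 1, 0],
    [1, 0, 0],
    [0, 1, 0],
    [0, 1, 1],
    [0, 1, 0],
], dtype=np.float32)

# Firing rate = spikes per time step, per output neuron.
firing_rate = out_spikes.mean(axis=0)   # [0.25, 0.875, 0.25]
predicted = int(firing_rate.argmax())
print(predicted)  # 1 -- neuron 1 fires most often, so class 1 is predicted
```

The argmax over firing rates plays the role of the softmax readout in a conventional network: higher spiking frequency stands in for higher class probability.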