2023
DOI: 10.1109/tcad.2023.3274918
SpikeSim: An End-to-End Compute-in-Memory Hardware Evaluation Tool for Benchmarking Spiking Neural Networks

Cited by 12 publications (3 citation statements)
References 41 publications
“…It can be seen that SNNs consume up to 94% less energy than ANNs, which could largely promote the battery life in smart devices. However, in the image processing domain, SNNs may have higher data-moving energy because they need to store the membrane potential and access them in the future (Yin et al, 2022 , 2023 ; Moitra et al, 2023 ). We demonstrate that, in the HAR domain, SNNs have even lower data-moving energy than ANNs.…”
Section: Methods (mentioning)
confidence: 99%
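The data-movement cost mentioned in the excerpt above stems from the leaky integrate-and-fire (LIF) neuron's persistent membrane potential, which must be written back and re-read at every timestep. As a rough illustration (not code from the cited works), a minimal NumPy sketch of one LIF update, where `decay` and `v_th` are assumed hyperparameters:

```python
import numpy as np

def lif_step(v, x, decay=0.9, v_th=1.0):
    """One timestep of a leaky integrate-and-fire (LIF) neuron layer.

    The membrane potential `v` is persistent state: it is rewritten to
    memory every timestep and re-read at the next one, which is the
    extra data movement the excerpt refers to.
    """
    v = decay * v + x                        # leak, then integrate input
    spikes = (v >= v_th).astype(np.float32)  # fire where threshold crossed
    v = v * (1.0 - spikes)                   # hard reset of fired neurons
    return v, spikes

# Toy run over T timesteps: v must survive between iterations,
# unlike a stateless ANN activation.
v = np.zeros(3)
rng = np.random.default_rng(0)
for t in range(4):
    v, s = lif_step(v, rng.uniform(0.0, 0.6, 3))
```

With a constant input of 0.5 per step, the potential accumulates across timesteps (0.5, 0.95, 1.355) before crossing the threshold and resetting — each intermediate value is state that a hardware implementation must store.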
“…With efficient storage of synaptic weights and specialized memory technologies, neuromorphic architectures provide scalable and brain-inspired computing solutions. Tool development to evaluate this hardware is also an active field of research, one such example being the work of [ 51 ] which introduced SpikeSim, a platform for an end-to-end compute-in-memory benchmarking tool to compare different SNN models on a chip for their power and latency efficiency.…”
Section: Neuromorphic Computing: Hardware vs Software (mentioning)
confidence: 99%
“…Digital implementations of SNNs are often favored for their compatibility with existing digital systems, and their performance can be reliably estimated based on familiar digital frameworks, making them a popular choice. [ 7 ] While digital SNN systems are considerably more energy‐efficient than conventional von Neumann architecture and offer predictability when compared to analog SNN systems, it is important to note that their reliance on synchronous processing based on an intrinsic digital architecture sets them apart from the human brain and can lead to different results in their speed and power consumption. To date, numerous studies have employed fully analog systems with analog synapses and integrate‐and‐fire (I&F) neurons to implement SNNs.…”
Section: Introduction (mentioning)
confidence: 99%