Proceedings of the Neuro-Inspired Computational Elements Workshop 2020
DOI: 10.1145/3381755.3381760
On the computational power and complexity of Spiking Neural Networks

Abstract: The last decade has seen the rise of neuromorphic architectures based on artificial spiking neural networks, such as the SpiNNaker, TrueNorth, and Loihi systems. The massive parallelism and co-location of computation and memory in these architectures potentially allow for energy usage that is orders of magnitude lower than in traditional von Neumann architectures. However, to date, a comparison with more traditional computational architectures (particularly with respect to energy usage) is hampered by th…

Cited by 13 publications (12 citation statements)
References 15 publications
“…Then more complex permutations of interaction will need to be explored. The high complexity has been a concern for using SNN in real-time applications [10], [86], and we will explore solutions to improve the computational efficiency.…”
Section: Discussion
confidence: 99%
“…VSA can be formulated with different types of vectors, namely those containing real, complex, or binary entries, as well as with the multivectors of geometric algebra. These flavors of VSA come under many different names: Holographic Reduced Representation (HRR) [Plate, 1995a], [Plate, 2003], Multiply-Add-Permute (MAP) [Gayler, 1998], Binary Spatter Codes [Kanerva, 1997], Sparse Binary Distributed Representations (SBDR) [Rachkovskij and Kussul, 2001], [Rachkovskij, 2001], Sparse Block-Codes [Laiho et al, 2015], [Frady et al, 2020b], Matrix Binding of Additive Terms (MBAT) [Gallant and Okaywe, 2013], Geometric Analogue of Holographic Reduced Representation (GAHRR) [Aerts et al, 2009], etc. All of these different models have similar computational properties; see [Frady et al, 2018b] and [Schlegel et al, 2020].…”
Section: Fundamentals of VSA
confidence: 99%
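The MAP model mentioned in the excerpt above can be illustrated with a short sketch: binding is elementwise multiplication of bipolar hypervectors, bundling is componentwise addition followed by a sign threshold, and similarity is a normalized dot product. The dimensionality and the role/filler names below are illustrative assumptions, not drawn from any of the cited works:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 10_000  # hypervector dimensionality (illustrative choice)

def rand_hv():
    # Random bipolar (+1/-1) hypervector, as in the MAP model
    return rng.choice([-1, 1], size=d)

def bind(a, b):
    # Elementwise multiply; self-inverse, so bind(bind(a, b), a) ~ b
    return a * b

def bundle(*vs):
    # Componentwise majority vote (ties become 0)
    return np.sign(np.sum(vs, axis=0))

def sim(a, b):
    # Normalized dot product; ~0 for unrelated random hypervectors
    return float(a @ b) / d

color, shape = rand_hv(), rand_hv()
red, circle = rand_hv(), rand_hv()

# A record bundling two role-filler bindings
record = bundle(bind(color, red), bind(shape, circle))

# Unbinding with a role recovers a noisy version of its filler
probe = bind(record, color)
print(sim(probe, red))     # high (about 0.5 in expectation)
print(sim(probe, circle))  # near 0
```

The noisy result of unbinding is typically cleaned up by comparing against a codebook of known hypervectors; the large dimensionality is what makes the cross-talk between bundled terms negligible.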
“…Here the similarity of sequences is defined by the number of the same elements in the same sequential positions, where the sequences are aligned by their last elements. Evidently, this definition does not take into account the same elements in different positions, in contrast to, e.g., an edit distance of sequences [Levenshtein, 1966]. Note that the edit distance can be approximated by vectors of n-gram frequencies and their randomized versions akin to hypervectors (see, e.g., [Sokolov, 2007], [Hannagan et al, 2011]).…”
Section: A Computational Primitives Formalized in VSA
confidence: 99%
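The contrast drawn above can be made concrete with a minimal sketch: the classic dynamic-programming edit distance [Levenshtein, 1966] alongside a crude n-gram-frequency proxy. The bigram choice and the L1 distance used here are illustrative assumptions, not the specific construction of [Sokolov, 2007]:

```python
from collections import Counter

def levenshtein(a: str, b: str) -> int:
    # Classic dynamic-programming edit distance, row by row
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[-1] + 1,          # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def ngram_profile(s: str, n: int = 2) -> Counter:
    # Frequency vector of the string's n-grams (here: bigrams)
    return Counter(s[i:i + n] for i in range(len(s) - n + 1))

def ngram_distance(a: str, b: str, n: int = 2) -> int:
    # L1 distance between n-gram frequency vectors: a cheap,
    # position-insensitive proxy for edit distance
    pa, pb = ngram_profile(a, n), ngram_profile(b, n)
    return sum(abs(pa[g] - pb[g]) for g in set(pa) | set(pb))

print(levenshtein("kitten", "sitting"))      # → 3
print(ngram_distance("kitten", "sitting"))   # → 7
```

Unlike positional alignment, the n-gram profile credits shared substrings wherever they occur, which is why such profiles (and their randomized hypervector versions) can approximate edit similarity.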
“…Neuromorphic computers, due to their low energy consumption, are prime candidates for co-processors and accelerators in extremely heterogeneous computing systems [42,25]. However, because of the absence of a sound computability and complexity theory for neuromorphic computing [43], we are faced with a murky understanding of the big-O runtimes [44] of neuromorphic algorithms and unable to compare them to their conventional counterparts. This is not the case for other post-Moore computing paradigms such as quantum computing [45,46,47], which has a well-founded literature on the theory of quantum computation [48,49].…”
Section: Related Work
confidence: 99%