2017
DOI: 10.1007/978-3-319-70136-3_83

An Efficient Hardware Architecture for Multilayer Spiking Neural Networks

Abstract: The Spiking Neural Network (SNN) is the most recent computational model that can emulate the behavior of biological neural systems. This paper highlights and discusses an efficient hardware architecture for SNNs, which includes a layer-level tile architecture (LTA) for the neurons and synapses and a novel routing architecture (NRA) for the interconnections between the neuron nodes. In addition, a visualization performance monitoring platform is designed, which is used as functional verifica…
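As a concrete illustration of the kind of layer-by-layer computation such a tiled layer architecture would evaluate, the sketch below steps one layer of leaky integrate-and-fire (LIF) neurons through a few time steps. The neuron model, dimensions, and parameters here are assumptions made for illustration; the abstract does not say which neuron model the LTA implements.

```python
# Minimal sketch (assumed LIF model): one layer of spiking neurons updated per
# time step, the sort of per-layer computation a layer-level tile could run in
# parallel. All parameters and sizes are illustrative assumptions.
import numpy as np

def lif_layer_step(v, spikes_in, weights, tau=0.9, v_thresh=1.0, v_reset=0.0):
    """Advance one layer of leaky integrate-and-fire neurons by one time step.

    v         -- membrane potentials of this layer, shape (n_out,)
    spikes_in -- binary spike vector from the previous layer, shape (n_in,)
    weights   -- synaptic weight matrix, shape (n_out, n_in)
    """
    v = tau * v + weights @ spikes_in            # leak plus weighted input spikes
    spikes_out = (v >= v_thresh).astype(float)   # fire where threshold is reached
    v = np.where(spikes_out > 0, v_reset, v)     # reset neurons that fired
    return v, spikes_out

# Example: 4 input channels driving 3 neurons for 5 time steps.
rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.5, size=(3, 4))
v = np.zeros(3)
for t in range(5):
    s_in = rng.integers(0, 2, size=4).astype(float)
    v, s_out = lif_layer_step(v, s_in, w)
    print(t, s_out)
```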

Cited by 10 publications (15 citation statements)
References 12 publications
“…Another is mapping the search space into a low-dimensional one. For example, using an auto-encoder [25] or sparse coding [40]. However, these types of methods do not consider special properties of the search problem, e.g., the graph structure of the supernet.…”
Section: Discussion
confidence: 99%
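For intuition about the low-dimensional mapping mentioned in the statement above, the following sketch compresses binary architecture encodings with a small linear autoencoder trained by plain gradient descent. This is a hypothetical illustration, not the method of [25] or [40]; the data, dimensions, and hyperparameters are all assumptions.

```python
# Hypothetical sketch: map architecture encodings into a low-dimensional latent
# space with a tiny linear autoencoder, so search could operate on the latent
# vectors instead of the raw encoding. Sizes and hyperparameters are assumed.
import numpy as np

rng = np.random.default_rng(0)
n_archs, enc_dim, latent_dim = 256, 20, 4

# Toy architecture encodings: each architecture is a binary vector of choices.
X = rng.integers(0, 2, size=(n_archs, enc_dim)).astype(float)

W_enc = rng.normal(0, 0.1, size=(enc_dim, latent_dim))  # encoder weights
W_dec = rng.normal(0, 0.1, size=(latent_dim, enc_dim))  # decoder weights
lr = 0.01

for step in range(2000):
    Z = X @ W_enc                 # encode to the low-dimensional space
    X_hat = Z @ W_dec             # decode back to the original encoding
    err = X_hat - X               # reconstruction error
    # Gradient descent on the mean squared reconstruction error.
    grad_dec = (Z.T @ err) / n_archs
    grad_enc = (X.T @ (err @ W_dec.T)) / n_archs
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

print("final reconstruction MSE:", float(np.mean(err ** 2)))
print("latent code of the first architecture:", (X @ W_enc)[0])
```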
“…On the other hand, evaluating all candidate architectures in the extremely large NAS search space is nearly impossible. To overcome these challenges, various surrogate models 13,14 and search algorithms are proposed in NAS to reduce the computational resources for evaluating candidate architectures and to improve the efficiency of the search algorithm.…”
Section: Introduction
confidence: 99%
“…On the other hand, evaluating all candidate architectures in the extremely large NAS search space is nearly impossible. To overcome these challenges, various surrogate models 13,14 and search algorithms are proposed in NAS to reduce the computational resources for evaluating candidate architectures and to improve the efficiency of the search algorithm. Typically, various predictors 15,16 are developed to estimate the performance of candidate architectures. Train-free predictors 17,18 offer the advantage of providing architecture performance without requiring any training samples and have demonstrated high efficiency and effectiveness, but their performance is usually not good enough in practice.…”
confidence: 99%
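The surrogate-model idea in the statements above can be made concrete with a small sketch: fit a cheap predictor on a handful of (architecture encoding, measured accuracy) pairs, then use it to rank a large pool of unseen candidates so that only the most promising ones need full training. The linear model, the toy accuracy function, and all sizes below are assumptions, not the predictors of the cited works.

```python
# Hypothetical sketch of a surrogate performance predictor for NAS.
# A linear model is fitted on a few evaluated architectures and then used to
# rank many unseen candidates without training them. Everything here is a toy
# stand-in: the "true" accuracy function and all dimensions are assumptions.
import numpy as np

rng = np.random.default_rng(1)
enc_dim = 12

def true_accuracy(x):
    # Stand-in for the expensive train-and-evaluate step of one architecture.
    pref = np.linspace(-0.5, 0.5, enc_dim)
    return 0.7 + 0.2 * np.tanh(x @ pref) + rng.normal(0, 0.01)

# Small labelled set: architectures we could afford to evaluate fully.
X_train = rng.integers(0, 2, size=(30, enc_dim)).astype(float)
y_train = np.array([true_accuracy(x) for x in X_train])

# Fit the surrogate (ordinary least squares with a bias term).
A = np.hstack([X_train, np.ones((len(X_train), 1))])
coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)

# Rank a large pool of unseen candidates by predicted accuracy.
X_pool = rng.integers(0, 2, size=(1000, enc_dim)).astype(float)
pred = np.hstack([X_pool, np.ones((len(X_pool), 1))]) @ coef
top5 = np.argsort(pred)[::-1][:5]
print("top-5 predicted accuracies:", np.round(pred[top5], 3))
```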
“…The concept of neural architecture search has emerged in response to the difficulty of designing deep learning models and the time-consuming tuning of their parameters [12]. Genetic algorithms [13], reinforcement learning [14], and Bayesian optimization [15] have been applied to the network architecture search process, and these methods have solved the problem of network architecture design to a certain extent, making it possible to automatically obtain an appropriate network architecture from the data.…”
Section: Introduction
confidence: 99%
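As a minimal sketch of the genetic-algorithm approach mentioned in the statement above, the loop below evolves binary architecture encodings under a toy fitness function that stands in for training and validating the decoded model. The population size, mutation rate, and fitness function are assumptions, not the method of [13].

```python
# Hypothetical sketch: genetic-algorithm search over binary architecture
# encodings. The fitness function is a toy proxy for validation accuracy;
# all hyperparameters are illustrative assumptions.
import random

ENC_DIM, POP, GENERATIONS, MUT_RATE = 16, 20, 30, 0.1

def fitness(arch):
    # Toy proxy for the validation accuracy of the decoded architecture.
    return sum(bit for i, bit in enumerate(arch) if i % 3 == 0)

def mutate(arch):
    # Flip each architecture choice with a small probability.
    return [1 - b if random.random() < MUT_RATE else b for b in arch]

population = [[random.randint(0, 1) for _ in range(ENC_DIM)] for _ in range(POP)]
for gen in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    parents = population[: POP // 2]                       # keep the best half
    children = [mutate(random.choice(parents)) for _ in range(POP - len(parents))]
    population = parents + children

best = max(population, key=fitness)
print("best architecture:", best, "fitness:", fitness(best))
```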