2021
DOI: 10.1007/s00521-021-06824-8
A heuristic approach to the hyperparameters in training spiking neural networks using spike-timing-dependent plasticity

Abstract: Spiking neural networks, often described as the third generation of neural networks, were developed to represent the neuronal activity of living organisms more accurately. Such networks have many parameters that can be difficult to adjust manually for a given classification problem. The analysis and selection of the coefficients' values in the network can therefore be treated as an optimization problem, and a practical method for selecting them automatically can decrease the time needed to develop such a model. In this paper, we propos…
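To make the abstract's framing of hyperparameter selection as an optimization problem more concrete, here is a minimal, hypothetical sketch in Python (not the authors' actual method). A simple heuristic search proposes candidate values for assumed STDP-related hyperparameters (a_plus, a_minus, tau_plus, tau_minus, threshold are illustrative names, not taken from the paper), scores each candidate with a placeholder evaluation function that stands in for training and validating the spiking network, and keeps the best configuration found.

```python
import random

# Assumed hyperparameter ranges for an STDP-trained SNN; the names and
# bounds are illustrative, not values from the paper.
SEARCH_SPACE = {
    "a_plus":    (0.001, 0.1),   # potentiation learning rate
    "a_minus":   (0.001, 0.1),   # depression learning rate
    "tau_plus":  (5.0, 50.0),    # potentiation time constant (ms)
    "tau_minus": (5.0, 50.0),    # depression time constant (ms)
    "threshold": (0.5, 5.0),     # neuron firing threshold
}

def evaluate(params):
    """Placeholder objective: in a real setting this would train the SNN
    with STDP using `params` and return validation accuracy."""
    # Synthetic stand-in so the sketch runs end to end.
    return -sum((v - (lo + hi) / 2.0) ** 2
                for (lo, hi), v in zip(SEARCH_SPACE.values(), params.values()))

def random_candidate():
    return {k: random.uniform(lo, hi) for k, (lo, hi) in SEARCH_SPACE.items()}

def heuristic_search(iterations=200):
    """Keep the best of `iterations` randomly proposed configurations."""
    best = random_candidate()
    best_score = evaluate(best)
    for _ in range(iterations):
        cand = random_candidate()
        score = evaluate(cand)
        if score > best_score:
            best, best_score = cand, score
    return best, best_score

if __name__ == "__main__":
    params, score = heuristic_search()
    print("best hyperparameters:", params, "score:", score)
```

Any population-based heuristic, such as those discussed in the citing works below, could replace the random sampling step without changing the surrounding loop.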

Cited by 10 publications (5 citation statements) · References 33 publications
“…The proposed approach involves two layers of base learners and a final meta-learner used to optimize the prediction accuracy. Other approaches involving a more accurate representation of neuronal activity use spiking neural networks [24]. To optimize the recognition rate, various heuristic algorithms including Cuckoo Search Algorithm, Grasshopper Optimization Algorithm, and Polar Bears Algorithm are used to compute the parameters of the spiking NN.…”
Section: Hybrid Methods
mentioning confidence: 99%
“…A single set of hyperparameter values in a neural network will not always suit every dataset, which remains a drawback of this feature [25]. The research conducted in [26] focuses on tuning the hyperparameters to find the best settings for the entire dataset used in the study, because these features are not well suited to all datasets. As for the training targets and test targets, flood event data are used, defined as flood (1) and no flood (0).…”
Section: E. Data Analysis Methods
mentioning confidence: 99%
“…Recent NAS procedures can be categorized broadly into four classes: heuristics-based [18], [19], reinforcement-learning-based [20], evolutionary-algorithm-based [21], and gradient-based [22] approaches. Heuristics-based approaches are experimental and hence require a large computation time [19]. In [19], Polap et al. experimented with several heuristic algorithms to tune the hyperparameters for a specific task.…”
Section: Related Work
mentioning confidence: 99%
“…Heuristics-based approaches are experimental and hence require a large computation time [19]. In [19], Polap et al. experimented with several heuristic algorithms to tune the hyperparameters for a specific task. Subramanian et al. applied heuristics to find suitable hyperparameter values for the specific problem of identifying diseases in leaves [18].…”
Section: Related Work
mentioning confidence: 99%