2021
DOI: 10.15388/21-infor450
A Comparative Study of Stochastic Optimizers for Fitting Neuron Models. Application to the Cerebellar Granule Cell

Abstract: This work compares different algorithms to replace the genetic optimizer used in a recent methodology for creating realistic and computationally efficient neuron models. That method focuses on single-neuron processing and has been applied to cerebellar granule cells. It relies on the adaptive-exponential integrate-and-fire (AdEx) model, which must be adjusted with experimental data. The alternatives considered are: i) a memetic extension of the original genetic method, ii) Differential Evolution, iii) Teaching…
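The fitting task the abstract describes — adjusting AdEx parameters so the model's output matches experimental recordings — amounts to minimizing an error function over a bounded parameter space, which is what the compared stochastic optimizers do. A minimal sketch using SciPy's Differential Evolution on a toy objective (the exponential-decay model and its parameters are illustrative stand-ins, not the AdEx equations or the paper's fitness function):

```python
import numpy as np
from scipy.optimize import differential_evolution

# Toy stand-in for the fitting problem: recover known parameters of a
# simple exponential decay from a "target" trace. The real application
# fits AdEx neuron parameters against experimental data; this only
# illustrates the optimizer interface.
t = np.linspace(0.0, 1.0, 50)
true_params = (2.0, 3.0)  # amplitude, decay rate (illustrative)
target = true_params[0] * np.exp(-true_params[1] * t)

def fit_error(params):
    """Sum of squared differences between model output and target trace."""
    a, k = params
    model = a * np.exp(-k * t)
    return float(np.sum((model - target) ** 2))

# Bounded search over both parameters; DE needs no gradient information.
result = differential_evolution(
    fit_error,
    bounds=[(0.1, 10.0), (0.1, 10.0)],
    seed=1,
    tol=1e-8,
)
```

On this smooth two-parameter toy problem, `result.x` recovers the generating parameters; the real neuron-fitting objective is far more rugged, which is why the paper compares several stochastic methods.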

Cited by 4 publications (7 citation statements)
References 33 publications
“…The multi-start SASS is an especially informative comparison: since Tangram can be seen as a rule-based multi-start component linked to SASS, it should guide this local solver better than simply generating random starts. Finally, TLBO has been included in the comparison due to its reputation for being simple to tune and effective [4]. It is also worth highlighting that both TLBO and MSASS achieved very good results in the model-tuning application described in [4], which supports their selection.…”
Section: Experimentation and Results
confidence: 87%
“…Alternatively, if the function modeled the strength of the resulting product, the points of interest would presumably be the maxima of the corresponding function. One of the applications in which optimization stands out is model tuning, where the parameters become variables and the objective function is the comparison between the achieved and desired output [4,12]. This allows automating processes that used to rely on experts and might be biased by them.…”
Section: Introduction
confidence: 99%
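The model-tuning view in the quote above — parameters become variables, and the objective measures the mismatch between achieved and desired output — can be sketched with a SASS-style single-agent stochastic search, the local solver referenced alongside MSASS. The objective, starting point, and step-size adaptation rule below are illustrative assumptions, not the solver from the cited papers:

```python
import random

def objective(x):
    # Hypothetical tuning error: squared distance of the achieved output
    # from a desired optimum at (1, -2). A real application would compare
    # simulated against experimental traces here.
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

def sass(x0, steps=500, sigma=1.0, seed=42):
    """SASS-style local search: Gaussian perturbations with greedy
    acceptance and a simple adaptive step size (expand on success,
    contract on failure)."""
    rnd = random.Random(seed)
    x, best = list(x0), objective(x0)
    for _ in range(steps):
        cand = [xi + rnd.gauss(0.0, sigma) for xi in x]
        val = objective(cand)
        if val < best:          # keep only improving moves
            x, best = cand, val
            sigma *= 2.0        # success: widen the search
        else:
            sigma *= 0.9        # failure: narrow around the current point
    return x, best

x_best, err = sass([5.0, 5.0])
```

Multi-start SASS (MSASS) simply reruns such a loop from many starting points; the quoted passage argues that a rule-based component like Tangram can pick those starts more intelligently than uniform random sampling.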