2018
DOI: 10.3389/fninf.2018.00068

Code Generation in Computational Neuroscience: A Review of Tools and Techniques

Abstract: Advances in experimental techniques and computational power allowing researchers to gather anatomical and electrophysiological data at unprecedented levels of detail have fostered the development of increasingly complex models in computational neuroscience. Large-scale, biophysically detailed cell models pose a particular set of computational challenges, and this has led to the development of a number of domain-specific simulators. At the other level of detail, the ever growing variety of point neuron models i…

Cited by 37 publications (51 citation statements)
References 76 publications
“…Brian 2, a complete rewrite of the Brian simulator, solves the apparent dichotomy between flexibility and performance using the technique of code generation, which transparently transforms high-level user-defined models into efficient compiled code (Goodman, 2010; Stimberg et al., 2014; Blundell et al., 2018). This generated code is inserted within the flow of the simulation script, which makes it compatible with the procedural approach.…”
Section: Introduction
confidence: 99%
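To make the quoted description concrete, here is a minimal sketch of the kind of high-level model definition Brian 2 accepts; the equations, threshold, and parameter values are illustrative choices, and the compiled-code generation the quote refers to happens transparently inside run().

# Brian 2 sketch: the model is written as plain equation strings, and
# Brian 2 generates and compiles efficient low-level code behind the
# scenes when the simulation is run.
from brian2 import NeuronGroup, run, ms, mV

# Leaky integrate-and-fire dynamics as a high-level equation string.
eqs = '''
dv/dt = (v_rest - v) / tau : volt (unless refractory)
v_rest : volt
tau : second
'''

group = NeuronGroup(100, eqs,
                    threshold='v > -50*mV',
                    reset='v = -70*mV',
                    refractory=5*ms,
                    method='euler')
group.v = -70*mV
group.v_rest = -60*mV
group.tau = 10*ms

run(100*ms)  # code for the state update is generated and compiled here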
“…, but only in parts of the simulation. This technique is now increasingly used in other simulators; see Blundell et al. (2018) for a review. In brief, from the high-level abstract description of the model, we generate independent blocks of code (in C++ and other languages) that, when run in sequence, carry out the simulation. To generate this code, we make use of a combination of techniques from symbolic mathematics and compilers that are available in third-party Python libraries, as well as…”
confidence: 99%
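The symbolic mathematics libraries this quote alludes to include SymPy; the sketch below uses SymPy's documented ccode function to turn an exact update expression for a linear ODE into a C statement. It illustrates the general technique only, not the simulator's actual pipeline, and the variable names are illustrative.

# Sketch: derive an exact per-timestep update for dv/dt = (v_rest - v)/tau
# symbolically, then emit it as C code for insertion into a generated loop.
import sympy as sp

v, v_rest, tau, dt = sp.symbols('v v_rest tau dt')

# Exact solution of the linear ODE over one time step dt:
# v(t + dt) = v_rest + (v - v_rest) * exp(-dt / tau)
update = v_rest + (v - v_rest) * sp.exp(-dt / tau)

# Emit a C assignment suitable for a generated update kernel.
print('v = ' + sp.ccode(update) + ';')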
“…While the procedural connectivity presented in the previous section allows simulating models which would otherwise not fit into the memory of a GPU, there are additional problems when using code generation for models with a large number of neuron and synapse populations. GeNN and other SNN simulators which use code generation for all of their simulation code (21) (as opposed to, for example, NESTML (22), which uses code generation only for neuron simulation code) generate separate pieces of code for each population of neurons and synapses. This allows optimizations such as hard-coding constant parameters and, although generating code for models with many populations results in large code size, C++ CPU code can easily be divided between multiple modules and compiled in parallel, minimizing the effect on build time.…”
Section: Results
confidence: 99%
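To illustrate the per-population code generation with hard-coded constants described in this quote, here is a toy Python generator; the template, population names, and parameter values are hypothetical and do not reproduce GeNN's actual generated code.

# Toy generator: one separate C++ update function per neuron population,
# with each population's constant parameters baked in so the compiler can
# constant-fold them. With many populations the total code size grows,
# but each emitted module could be compiled in parallel.
CPP_TEMPLATE = """\
void update_{name}(float* v, int n) {{
    const float tau = {tau}f;      // hard-coded constant parameter
    const float v_rest = {v_rest}f;
    const float dt = {dt}f;
    for (int i = 0; i < n; ++i)
        v[i] += dt * (v_rest - v[i]) / tau;
}}
"""

populations = [
    {"name": "exc", "tau": 10.0, "v_rest": -60.0, "dt": 0.1},
    {"name": "inh", "tau": 5.0,  "v_rest": -65.0, "dt": 0.1},
]

for pop in populations:
    print(CPP_TEMPLATE.format(**pop))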
“…The technique of code generation allows us to solve this apparent conflict, and has been used by both the GeNN and Brian simulators 9,10,19 as well as a number of other neural simulators 11. In the case of GeNN, when writing a new model users need to write only a very small section of generic C++ code that defines how the variables of a neuron model are updated; this is then inserted into a detailed template that allows that model to be simulated efficiently on a GPU.…”
Section: Discussion
confidence: 99%
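This quote describes users writing only a small variable-update snippet that the simulator splices into a detailed GPU template. The sketch below shows that substitution step in Python; the $(var) placeholder style echoes GeNN's, but the kernel template and the expansion rules here are simplified assumptions, not GeNN's real code generator.

# Sketch: splice a user-supplied update snippet into a CUDA-flavoured
# kernel template via placeholder substitution.
USER_SIM_CODE = "$(V) += DT * (($(Vrest) - $(V)) / $(tau));"

KERNEL_TEMPLATE = """\
__global__ void updateNeurons(float* V, float Vrest, float tau, int n) {{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {{
        {sim_code}
    }}
}}
"""

def expand(snippet):
    # Map $(var) placeholders onto per-thread array accesses / arguments.
    return (snippet.replace("$(V)", "V[i]")
                   .replace("$(Vrest)", "Vrest")
                   .replace("$(tau)", "tau")
                   .replace("DT", "0.1f"))

print(KERNEL_TEMPLATE.format(sim_code=expand(USER_SIM_CODE)))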