1990
DOI: 10.1177/003754979005500203
SLONN: A Simulation Language for modeling of Neural Networks

Abstract: This paper presents a general-purpose Simulation Language for modeling Of Neural Networks (SLONN), which has been implemented in our laboratory. Based on a new neuron model, SLONN can represent both the spatial and temporal summation of a single neuron as well as synaptic plasticity. By introducing the fork construct to describe connection patterns between neurons, and by using repetition connections, module types, and module arrays to specify large networks, SLONN can be used to specify both small and large neural networks effectively. Th…
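The abstract's notions of spatial and temporal summation can be illustrated with a minimal leaky-integrator sketch. The paper's actual neuron equations are not reproduced on this page, so every name, rate, and threshold below is an assumption, not SLONN's model:

```python
# Hedged sketch: SLONN's neuron equations are not given in this abstract,
# so this leaky integrator only illustrates the two ideas generically.

def step(potential, weights, inputs, decay=0.9, threshold=1.0):
    """One time step of a single neuron.

    Spatial summation: weighted sum over all synaptic inputs.
    Temporal summation: the decayed membrane potential carries earlier,
    sub-threshold input forward in time.
    """
    potential = decay * potential + sum(w * x for w, x in zip(weights, inputs))
    fired = potential >= threshold
    if fired:
        potential = 0.0  # reset after firing (a common modeling convention)
    return potential, fired

# Repeated sub-threshold inputs accumulate until the neuron fires.
p, log = 0.0, []
for _ in range(5):
    p, fired = step(p, [0.5, 0.5], [0.4, 0.4])
    log.append(fired)
# log -> [False, False, True, False, False]
```

Each input alone (weighted sum 0.4) is sub-threshold; the firing on the third step comes entirely from temporal accumulation.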

Cited by 19 publications (4 citation statements)
References 25 publications
“…Our models are designed at the abstract level of the IDSM. In this, they differ from most of the related work cited in the introduction (Bi et al., 2017; Hong et al., 2020; Marsland et al., 1999; Wang & Hsu, 1990), which works at the level of neurons or synapses. One slight advantage is that our models, being more conceptual, can be easier to analyze and understand, but, as discussed now, an in-depth understanding of our models can still be complex.…”
Section: Discussion
confidence: 63%
“…Our models are designed at the abstract level of the IDSM. In this, they differ from most of the related work cited in the introduction (Bi et al., 2017; Hong et al., 2020; Marsland et al., 1999; Wang & Hsu, 1990), which works at the level of neurons or synapses. One slight advantage is that…”
Section: Models Comparison
confidence: 69%
“…Some researchers have established mathematical models of the effects of habituation on the efficacy of a synapse, including Groves and Thompson [23], Stanley [24], and Wang and Hsu [25]. The model proposed by Wang and Hsu incorporates long-term memory, meaning that an animal habituates more quickly to a stimulus to which it has previously been habituated.…”
confidence: 99%
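The long-term-memory property described in this citation statement can be sketched as follows. The actual Wang-Hsu equations are not reproduced on this page, so the update rules and rate constants below are illustrative assumptions only:

```python
# Hedged sketch: illustrative habituation dynamics with a long-term trace,
# not the Wang-Hsu model itself. Efficacy y habituates while a stimulus is
# present and recovers toward baseline otherwise; a slow long-term trace z
# makes later habituation faster and deeper. All rates are assumptions.

def habituate(stimulus_train, tau_h=0.1, tau_r=0.05, tau_z=0.02):
    y, z = 1.0, 1.0  # short-term efficacy, long-term memory trace
    trace = []
    for s in stimulus_train:
        if s:
            y -= tau_h * (2.0 - z) * y  # habituation, faster once z has decayed
            z -= tau_z * z              # long-term trace decays slowly
        else:
            y += tau_r * (1.0 - y)      # recovery toward baseline efficacy
        trace.append(y)
    return trace

# Two stimulation blocks separated by rest: the second block habituates to a
# lower efficacy than the first, reflecting the long-term memory effect.
train = [1] * 10 + [0] * 50 + [1] * 10
trace = habituate(train)
```

Because z recovers much more slowly than y, the rest interval restores short-term efficacy almost fully while leaving the long-term trace depressed, so re-exposure drives efficacy down faster and further than the first exposure did.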