2008
DOI: 10.1007/s11063-008-9089-6
A New Associative Model with Dynamical Synapses

Abstract: The brain is not a huge fixed neural network, but a dynamic, changing neural network that continuously adapts to meet its communication and computational demands. In classical neural network approaches, particularly associative memory models, synapses are adjusted only during the training phase; after this phase, synapses are no longer modified. In this paper we describe a new dynamical model in which the synapses of the associative memory can be adjusted even after the training phase, as a response to an …
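The abstract's central idea is an associative memory whose synapses keep changing after training, in response to incoming stimuli. As a rough illustration of that idea only, and not the model proposed in the paper, the sketch below builds a standard Hebbian (outer-product) auto-associative memory in NumPy and then applies a small post-training weight update; the dynamical_update function and its rate parameter are assumptions introduced purely for illustration.

    import numpy as np

    def train(patterns):
        """Build the weight matrix from bipolar (+1/-1) patterns via Hebbian learning."""
        n = patterns.shape[1]
        W = np.zeros((n, n))
        for p in patterns:
            W += np.outer(p, p)
        np.fill_diagonal(W, 0.0)          # no self-connections
        return W / len(patterns)

    def recall(W, x, steps=10):
        """Iteratively recover a stored pattern from a (possibly noisy) input."""
        y = x.copy()
        for _ in range(steps):
            y = np.sign(W @ y)
            y[y == 0] = 1                 # resolve ties to +1
        return y

    def dynamical_update(W, x, rate=0.05):
        """Hypothetical post-training adjustment (assumed, not from the paper):
        nudge the synapses toward the correlation structure of the current stimulus."""
        W = W + rate * np.outer(x, x)
        np.fill_diagonal(W, 0.0)
        return W

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        stored = np.sign(rng.standard_normal((3, 16)))
        W = train(stored)

        noisy = stored[0].copy()
        noisy[:3] *= -1                   # corrupt a few bits
        print(np.array_equal(recall(W, noisy), stored[0]))

        # Synapses continue to adapt after training, in the spirit of the abstract.
        W = dynamical_update(W, noisy)

In this toy version the post-training update simply reinforces the most recent input; the paper's actual mechanism for adjusting synapses after training is not reproduced here.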

Cited by 29 publications (5 citation statements)
References 43 publications
“…Different combinations for the genetic algorithm parameters were tested based on their meaning and on common literature recommended values [19][20][21][22], but the best results were achieved by the parameters described here.…”
Section: Results
confidence: 99%
“…In [21], a 100% efficiency was achieved for this dataset. Table 19 shows the classification efficiencies for the Screw dataset using feedforward neural networks (FNN).…”
Section: Screw Dataset
confidence: 86%
“…The object recognition problem was taken from [26], and the spiral, synthetic 1, and synthetic 2 datasets were developed in our laboratory. The pattern dispersions of these datasets are shown in Figure 2.…”
Section: Tuning the Parameters for PSO Algorithms
confidence: 99%
“…There are two kinds of signal approximation: one is the estimation of parameters [1][2][3][4][5][6][7][8][9][10][11][12][13][14][15][16][17][18], and the other is the interpolation [19][20][21][22][23][24][25][26][27][28][29][30][31][32][33][34][35][36].…”
Section: Introduction
confidence: 99%
“…In [11], a gas chamber and four defect models were designed. In [12], the authors describe a new dynamic model where synapses of the associative memory could be adjusted. In [13,14], the authors introduce new bidirectional hetero-associative memory models for true-colour patterns.…”
Section: Introduction
confidence: 99%