2019
DOI: 10.3390/e21080726
Beyond the Maximum Storage Capacity Limit in Hopfield Recurrent Neural Networks

Abstract: In a neural network, an autapse is a particular kind of synapse that links a neuron onto itself. Autapses are almost always excluded from both artificial and biological neural network models. Moreover, redundant or similar stored states tend to interact destructively. This paper shows how autapses, together with stable-state redundancy, can improve the storage capacity of a recurrent neural network. Recent research shows how, in an N-node Hopfield neural network with autapses, the number of stored patterns (P) …
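As a rough, hedged illustration of the abstract's claim (not the paper's actual construction: the Hebbian learning rule, the random bipolar patterns, and the use of exact pattern stability as a capacity proxy are all assumptions of this sketch), the snippet below compares how many stored patterns remain fixed points when the diagonal of the weight matrix, i.e. the autapses, is kept versus zeroed out:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 30                         # illustrative network size and load
patterns = rng.choice([-1, 1], size=(P, N))

W = patterns.T @ patterns / N          # Hebbian weights, autapses (diagonal) kept
W_nodiag = W - np.diag(np.diag(W))     # classical prescription: no self-connections

def n_stable(W, patterns):
    """Count stored patterns that are exact fixed points of the sign dynamics."""
    return sum(np.array_equal(np.sign(W @ p), p) for p in patterns)

print("stable patterns with autapses:   ", n_stable(W, patterns))
print("stable patterns without autapses:", n_stable(W_nodiag, patterns))
```

At this load (P/N = 0.3, well above the classical capacity), the self-coupling term adds a field component aligned with each stored pattern, so markedly more patterns survive as fixed points when the diagonal is retained.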

Cited by 26 publications (24 citation statements)
References 23 publications
“…the components of the state vectors are {+1, -1}. The associated convergence theorem ensures that in the serial mode of operation the initial state converges to a stable state, while in the fully parallel mode of operation it converges either to a stable state or to a cycle of length at most two [42]. In essence, the stable states are realized as the "memory states" of the associative memory.…”
Section: Discrete Hopfield Neural Network
confidence: 99%
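A minimal sketch of the serial-mode convergence described in the excerpt, assuming the standard zero-threshold discrete Hopfield formulation (sizes, seeds, and names below are illustrative and not taken from reference [42]): with symmetric, zero-diagonal weights, updating one unit at a time always terminates in a stable state.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 50
patterns = rng.choice([-1, 1], size=(3, N))
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)            # symmetric, zero-diagonal weights

s = rng.choice([-1, 1], size=N)     # arbitrary bipolar initial state
for sweep in range(100):
    changed = False
    for i in rng.permutation(N):    # serial mode: one unit updated per step
        new = 1 if W[i] @ s >= 0 else -1
        if new != s[i]:
            s[i], changed = new, True
    if not changed:                 # a full sweep with no flips -> stable state
        print(f"converged after {sweep + 1} sweeps")
        break
```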
“…where U_{jk} is the synaptic weight connecting neurons j and k, S_j denotes the state of unit j, and θ_j is the threshold function of neuron j. Several studies [28,42–48] set θ_j = 0 to verify that the HNN always decreases its energy monotonically. Each time a pair of neurons is connected through U_{jk}, the value of the connection is preserved as a stored pattern in the interconnection matrix, where [36,60] note that the constraint on the synaptic weight matrix U in (1) does not allow self-loop neuron connections…”
Section: Discrete Hopfield Neural Network
confidence: 99%
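To make the monotone-energy argument concrete, here is a small numerical check, under the same assumptions as the sketch above (Hebbian weights, θ_j = 0, zero diagonal; the helper names are mine), that the energy E = -(1/2) Σ_{j,k} U_{jk} S_j S_k never increases under serial updates:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 40
patterns = rng.choice([-1, 1], size=(3, N))
U = patterns.T @ patterns / N
np.fill_diagonal(U, 0.0)            # the zero-diagonal constraint from the excerpt

def energy(U, s):
    # E = -1/2 * sum_jk U_jk S_j S_k  (thresholds theta_j set to 0)
    return -0.5 * s @ U @ s

s = rng.choice([-1, 1], size=N)
E_prev = energy(U, s)
for _ in range(200):                # serial updates, one random unit at a time
    i = rng.integers(N)
    s[i] = 1 if U[i] @ s >= 0 else -1
    E = energy(U, s)
    assert E <= E_prev + 1e-12, "energy must be non-increasing"
    E_prev = E
print("final energy:", E_prev)
```

Each serial flip changes the energy by -2|h_i| ≤ 0, where h_i is the local field, which is why the assertion holds for symmetric, zero-diagonal weights.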
“…The benefit of metaheuristic algorithms in HNN is that the network can be tuned at different stages, such as weight training and adjustment, system adaptation for determining the number of layers, node transfer functions, the retrieval phase, and the learning rules. MEA has emerged as a new evolutionary metaheuristic algorithm that has been applied to find optimal solutions in computational optimization and engineering applications [28–31], unlike many other metaheuristics, which are mainly inspired by swarm behavior or natural evolutionary processes.…”
Section: Introduction
confidence: 99%
“…The limitation in (b) comes from the fact that, once a pattern is recognized, the system stays in that state, even if the input pattern had acted only at the beginning of the process; this feature prevents quick reactions to external changes. Improvements on the Hopfield model, like making the coupling constants non-symmetric (in which case there is no Hamiltonian) [14,15,16], correlating or coupling memories [17,18,19,20], introducing dilution [16,21,22,23,24,25] or autapses [26], attempt to remedy, at least partially, drawbacks (a) and (b) above; in what follows, we propose a model to overcome both difficulties.…”
Section: Introduction
confidence: 99%