2006
DOI: 10.1016/j.physa.2005.11.049

Feed-forward chains of recurrent attractor neural networks with finite dilution near saturation

Abstract: A stationary state replica analysis for a dual neural network model that interpolates between a fully recurrent symmetric attractor network and a strictly feed-forward layered network, studied by Coolen and Viana, is extended in this work to account for finite dilution of the recurrent Hebbian interactions between binary Ising units within each layer. Gradual dilution is found to suppress part of the phase transitions that arise from the competition between recurrent and feedforward operation modes of the netw…


Cited by 2 publications (3 citation statements)
References 29 publications
“…We have confirmed this conjecture by means of numerical simulations. The connectivity regime c ∝ N b (0 < b < 1) lies between sparse (c = O(1)) and diluted (c = O(N)) networks [84][85][86][87]. Even though this intermediate connectivity range, called the extremely diluted regime, has been known for a long time in the field of neural networks [95,96], it has been studied only in the case of homogeneous networks, for which degree fluctuations are unimportant.…”
Section: Discussion
confidence: 99%
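The connectivity regimes quoted above can be made concrete with a short sketch. This is illustrative code, not from the paper or any citing work; the function name, the exponent b = 0.5, and the prefactor c0 = 10 are arbitrary example choices used only to show how the mean degree c scales with network size N in each regime:

```python
def mean_connectivity(N, regime, b=0.5, c0=10):
    """Scaling of the mean degree c with system size N for the three regimes."""
    if regime == "sparse":          # c = O(1): finite connectivity
        return c0
    elif regime == "intermediate":  # c ∝ N^b, 0 < b < 1: "extremely diluted"
        return c0 * N**b
    elif regime == "diluted":       # c = O(N): extensive connectivity
        return c0 * N
    raise ValueError(f"unknown regime: {regime}")

# For growing N, the intermediate regime diverges (unlike sparse) but its
# ratio to N vanishes (unlike diluted), placing it strictly between the two.
for N in (10**2, 10**4, 10**6):
    print(N, mean_connectivity(N, "sparse"),
          mean_connectivity(N, "intermediate"),
          mean_connectivity(N, "diluted"))
```

In this sense c ∝ N^b with 0 < b < 1 interpolates between the sparse (b = 0) and diluted (b = 1) limits mentioned in the quotations.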
“…Figure 7 confirms that the mean-field theory presented in this work describes spin models on heterogeneous networks where c scales as c ∝ N b , with 0 < b < 1. This regime of connectivity lies between sparse networks (b = 0) and diluted networks (b = 1) [84][85][86][87].…”
Section: Random Couplings
confidence: 99%
“…We have confirmed this conjecture by means of numerical simulations. The connectivity c ∝ N b (0 < b < 1) lies between sparse (c = O(1)) and diluted (c = O(N)) networks [80][81][82][83]. Even though this intermediate connectivity range, called the extreme diluted regime, has been known for a long time in the field of neural networks [91,92], it has been studied only in the case of homogeneous networks, for which degree fluctuations are unimportant.…”
confidence: 99%