2011
DOI: 10.1109/tnn.2011.2146789

Sparse Neural Networks With Large Learning Diversity

Abstract: Coded recurrent neural networks with three levels of sparsity are introduced. The first level is related to the size of messages, which are much smaller than the number of available neurons. The second is provided by a particular coding rule, acting as a local constraint on the neural activity. The third is a characteristic of the low final connection density of the network after the learning phase. Though the proposed network is very simple, since it is based on binary neurons and binary connections, it …

Cited by 122 publications (117 citation statements)
References 20 publications
“…Later, Gripon and Berrou came up with a different approach based on neural cliques, which increased the pattern retrieval capacity to O(n^2) [6]. Their method is based on dividing a neural network of size n into c clusters of size n/c each.…”
Section: Related Work
confidence: 99%
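A minimal sketch of the clustered storage scheme this quote describes, assuming n binary neurons split into c clusters with one active neuron per cluster for each stored message; the names n, c, l, W, and store are illustrative, not the paper's notation:

```python
import numpy as np

n, c = 256, 4          # network size and number of clusters (illustrative values)
l = n // c             # neurons per cluster

def store(messages, n, c):
    """Store each message as a clique: one active neuron per cluster, fully interconnected."""
    l = n // c
    W = np.zeros((n, n), dtype=np.uint8)   # binary connection matrix
    for msg in messages:                   # msg holds one symbol in [0, l) per cluster
        active = [k * l + s for k, s in enumerate(msg)]
        for i in active:
            for j in active:
                if i != j:
                    W[i, j] = 1            # binary, not weighted, connection
    return W

# Example: store two random messages of c symbols each.
rng = np.random.default_rng(0)
msgs = rng.integers(0, l, size=(2, c))
W = store(msgs, n, c)
print(int(W.sum()))    # each stored clique adds at most c*(c-1) connections
```

Each stored message thus becomes a clique of binary connections linking its c selected neurons, one per cluster.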
“…Next, we present a sufficient condition such that the minimum Hamming distance between this exponential number of patterns is not too small. In order to prove such a result, we will exploit the expansion properties of the bipartite graph W; our sufficient condition will be in terms of a lower bound on the parameters of the expander graph.…”
Section: Minimum Distance of Patterns
confidence: 99%
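For illustration only, a brute-force check of the minimum Hamming distance between a handful of stored binary patterns; the cited paper bounds this distance analytically via the expansion properties of the bipartite graph rather than by enumeration:

```python
import itertools
import numpy as np

def min_hamming_distance(patterns):
    """Smallest number of positions in which any two distinct patterns differ."""
    return min(
        int(np.sum(p != q))
        for p, q in itertools.combinations(np.asarray(patterns), 2)
    )

patterns = np.array([[0, 1, 1, 0], [1, 1, 0, 0], [0, 0, 1, 1]])
print(min_hamming_distance(patterns))   # -> 2
```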
“…Recently, Gripon and Berrou [5], [6] introduced a novel architecture of associative memories based on the principles of modern error-correcting codes such as LDPC codes [7]. They proposed to store pieces of information in a multipartite binary graph by using cliques (a subset of fully interconnected nodes).…”
Section: Introduction
confidence: 99%
“…This is referred to as the "winner-takes-all" principle in the neuroscience literature. Gripon and Berrou [5], [6] claim to provide near-optimal efficiency, along with limited computational complexity and low error probability. The recent extension [8] proposes novel decoding rules that both enhance performance and yield convergence guarantees.…”
Section: Introduction
confidence: 99%
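A minimal sketch of the "winner-takes-all" retrieval step mentioned in this quote: within each cluster, only the neuron receiving the most support from the active neurons of the other clusters stays active. The sizes and the single stored message below are illustrative, and the actual decoding rules of [5], [6], [8] may differ:

```python
import numpy as np

n, c = 16, 4
l = n // c                              # neurons per cluster

# Store one message (one symbol per cluster) as a clique of binary connections.
msg = [2, 0, 3, 1]
W = np.zeros((n, n), dtype=np.uint8)
active = [k * l + s for k, s in enumerate(msg)]
for i in active:
    for j in active:
        if i != j:
            W[i, j] = 1

# Retrieve the erased last symbol with one winner-takes-all pass per cluster.
guess = [2, 0, 3, None]                 # None marks the erased cluster
known = [k * l + s for k, s in enumerate(guess) if s is not None]
for k in range(c):
    # Score every neuron of cluster k by the support it gets from the other clusters.
    scores = [sum(W[k * l + s, a] for a in known if a // l != k) for s in range(l)]
    guess[k] = int(np.argmax(scores))   # winner takes all within cluster k

print(guess)                            # -> [2, 0, 3, 1]
```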