2011 IEEE Information Theory Workshop
DOI: 10.1109/itw.2011.6089532

Exponential pattern retrieval capacity with non-binary associative memory

Abstract: We consider the problem of neural association for a network of non-binary neurons. Here, the task is to recall a previously memorized pattern from its noisy version using a network of neurons whose states assume values from a finite number of non-negative integer levels. Prior works in this area consider storing a finite number of purely random patterns, and have shown that the pattern retrieval capacities (maximum number of patterns that can be memorized) scale only linearly with the number of neuron…

Cited by 10 publications (35 citation statements) | References 25 publications
“…In [10], we presented some preliminary results in which two efficient recall algorithms were proposed for the case where the neural graph had the structure of an expander [11]. Here, we extend the previous results to general sparse neural graphs as well as proposing a simple learning algorithm to capture the internal structure of the patterns (which will be used later in the recall phase).…”
Section: Introduction
confidence: 73%
“…Recently, the present authors introduced a novel model inspired by modern coding techniques in which a neural bipartite graph is used to memorize the patterns that belong to a subspace [10]. The proposed model can also be thought of as a way to capture higher-order correlations in given patterns while keeping the computational complexity to a minimal level (since instead of O(n^(p-2)) weights one needs to only keep track of O(n^2) of them).…”
Section: Related Work
confidence: 99%
“…To achieve an exponential scaling in the storage capacity of neural networks Kumar et al [5] suggested a different viewpoint in which the network is no longer required to memorize any set of random patterns but only those that have some common structure, namely, patterns all belong to a subspace with dimension k < n. Karbasi et al [6] extended this model to "modular" neural architectures and introduced a suitable online learning algorithm. They showed that the modular structure improves the noise tolerance properties significantly.…”
Section: Introduction
confidence: 99%
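The subspace idea in the excerpts above can be illustrated with a small toy sketch (this is not the paper's neural recall algorithm, and all names here are illustrative): patterns are integer combinations drawn from the row space of a matrix G, so any vector H whose rows span the null space of G satisfies H·x = 0 for every stored pattern x. A noisy pattern then produces a nonzero "syndrome" H·y that points at the corrupted neuron, much like a parity check in coding theory.

```python
import numpy as np

def nullspace(G, tol=1e-10):
    # Rows of the returned matrix span the null space of G (via SVD).
    _, s, vh = np.linalg.svd(G)
    rank = int((s > tol).sum())
    return vh[rank:]

rng = np.random.default_rng(0)
n, k = 10, 4                                   # n neurons, k-dimensional subspace
G = rng.integers(-2, 3, size=(k, n)).astype(float)
H = nullspace(G)                               # H @ x == 0 for any x in the row space of G

x = G.T @ rng.integers(0, 3, size=k)           # a stored pattern (integer combination)
e = np.zeros(n)
e[3] = 2.0                                     # corrupt a single neuron by +2
y = x + e

s = H @ y                                      # syndrome: equals 2 * H[:, 3]
# Locate the corrupted neuron: its column of H is parallel to the syndrome.
scores = [abs(s @ H[:, j]) / (np.linalg.norm(H[:, j]) * np.linalg.norm(s))
          for j in range(n)]
j_hat = int(np.argmax(scores))
d_hat = (s @ H[:, j_hat]) / (H[:, j_hat] @ H[:, j_hat])
x_hat = y.copy()
x_hat[j_hat] -= d_hat                          # remove the estimated noise
```

Storing H costs O(n^2) weights regardless of how many subspace patterns exist, which is the complexity point made in the Related Work excerpt; the actual paper performs this constraint enforcement with iterative, neurally plausible updates over a sparse bipartite graph rather than the direct linear-algebra shortcut used here.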