1999
DOI: 10.1007/s004220050566
Error tolerant associative memory

Abstract: We present a new approach to enlarging the basin of attraction of associative memory, including auto-associative memory and temporal associative memory. The memory trained by means of this method can tolerate and recover from seriously noisy patterns. Simulations show that this approach will greatly reduce the number of limit cycles.
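The abstract concerns recovering stored patterns from noisy probes in an auto-associative memory. The paper's own training method is not reproduced here, but the basic setting it improves on can be sketched with a standard Hopfield-style network using the Hebbian outer-product rule (all names and parameters below are illustrative, not from the paper):

```python
import numpy as np

# Two mutually orthogonal bipolar patterns stored in an 8-neuron network.
patterns = np.array([[ 1,  1,  1,  1,  1,  1,  1,  1],
                     [ 1,  1,  1,  1, -1, -1, -1, -1]])
N = patterns.shape[1]

# Hebbian outer-product rule with the conventional zero diagonal
# (the paper under discussion is among works exploring self-feedback instead).
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

def recall(state, max_steps=10):
    """Iterate synchronous sign updates until a fixed point is reached."""
    state = state.copy()
    for _ in range(max_steps):
        nxt = np.where(W @ state >= 0, 1, -1)
        if np.array_equal(nxt, state):
            break
        state = nxt
    return state

# Corrupt one bit of the second pattern, then let the dynamics clean it up.
noisy = patterns[1].copy()
noisy[0] = -noisy[0]
print(recall(noisy))   # recovers [ 1  1  1  1 -1 -1 -1 -1]
```

The noisy probe lies inside the stored pattern's basin of attraction, so the dynamics converge back to it; enlarging those basins, so that much more severely corrupted probes still converge correctly, is exactly the goal the abstract states.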

Cited by 14 publications (12 citation statements)
References 27 publications
“…For example, the pseudo-inverse rule (also called projection rule) can increase HNNs storage capacity and improve accuracy (Wu et al, 2012; Sahoo et al, 2016), but is neither local nor incremental. Moreover, recent works have explored learning rules with self-feedback connections (non-0 diagonal), and have shown higher accuracy for a high number of stored patterns (Liou and Yuan, 1999; Folli et al, 2017; Rocchi et al, 2017; Gosti et al, 2019). In summary, despite the present limitations of the ONN, features in terms of FPS, computation time and training are encouraging toward the exploration of a wider range of applications.…”
Section: Limitations and Future Directionsmentioning
confidence: 99%
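The projection (pseudo-inverse) rule mentioned in this excerpt can be sketched as follows; this is a generic illustration, not code from any of the cited works, and the network size and pattern count are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(1)
N, K = 32, 8                                          # 8 patterns, 32 neurons
X = rng.choice([-1, 1], size=(N, K)).astype(float)    # patterns as columns

# Projection (pseudo-inverse) rule: W is the projector onto the span of the
# stored patterns, so W @ x == x for every stored pattern x.
W = X @ np.linalg.pinv(X)

# Every stored pattern is then an exact fixed point of the sign dynamics.
fixed = all(np.array_equal(np.sign(W @ X[:, k]), X[:, k]) for k in range(K))
print(fixed)   # True
```

Note the two drawbacks the excerpt names: computing `W` needs all patterns at once and a matrix (pseudo-)inverse, so the rule is neither local nor incremental; also, `W` generally has a non-zero diagonal, which connects to the self-feedback line of work the excerpt cites.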
“…Several kinds of AM have been implemented. Among these are Bidirectional AM (BAM) [9], Morphological AM (MAM) [10], Multi-Associative Neural Network (MANN) [11], Error Tolerant Associative Memory (eTAM) [12], hetero-AM [13], Theory of Distributed Associative Memory (TODAM) [14] and Recurrent Correlation Associative Memory (RCAM) [15].…”
Section: Introductionmentioning
confidence: 99%
“…TODAM uses a vector-type memory which risks suffering from the crosstalk problem [16]. Determining the weights that produce correct output in BAM [9], MANN [11], and eTAM [12] is time consuming. The kernel image that has to be used in the recall process in MAM [10] and RCAM [15] is difficult to build.…”
Section: Introductionmentioning
confidence: 99%
“…There is a perfect design for the hidden neurons in (Liou and Sou, 2003). The design in (Liou and Yuan, 1999) provides a biologically plausible solution for the tolerance ability without any hidden neurons or annealing process. This work will present a method to further explore the idea behind this design and formulate it for finite memory loading.…”
Section: Introductionmentioning
confidence: 99%
“…The goal is somewhat similar to that of Gardner for extensive memory loading in (Gardner, 1989). This work will use the same experimental simulations as those in (Liou and Yuan, 1999) to ease comparison.…”
Section: Introductionmentioning
confidence: 99%