2008
DOI: 10.1109/tnn.2007.909528
Recurrent Correlation Associative Memories: A Feature Space Perspective

Abstract: In this paper, we analyze a model of recurrent kernel associative memory (RKAM) recently proposed by Garcia and Moreno. We show that this model consists in a kernelization of the recurrent correlation associative memory (RCAM) of Chiueh and Goodman. In particular, using an exponential kernel, we obtain a generalization of the well-known exponential correlation associative memory (ECAM), while using a polynomial kernel, we obtain a generalization of higher order Hopfield networks with Hebbian weights. We show t…
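The retrieval dynamics the abstract refers to can be illustrated concretely: an RCAM updates its state by weighting each stored pattern by a function of its correlation with the current state, then thresholding the weighted sum. The sketch below is a minimal NumPy illustration, not code from the paper; the names `rcam_step` and `weight_fn` and the base 2 in the exponential weighting are illustrative assumptions. An exponential weighting gives the ECAM special case mentioned in the abstract, while a polynomial weighting corresponds to higher-order Hopfield networks.

```python
import numpy as np

def rcam_step(patterns, x, weight_fn):
    # Correlation of the current state with each stored pattern.
    c = patterns @ x
    # Weighted superposition of the stored patterns, then a hard threshold.
    s = patterns.T @ weight_fn(c)
    return np.where(s >= 0, 1, -1)

# Exponential weighting: the ECAM special case (base 2 chosen for illustration).
ecam = lambda c: np.power(2.0, c)
# Polynomial weighting: related to higher-order Hopfield networks with Hebbian weights.
poly = lambda c: c ** 3

# Two bipolar (+/-1) stored patterns over 5 neurons.
patterns = np.array([[ 1, -1,  1, -1,  1],
                     [-1, -1,  1,  1,  1]])

# Start from a noisy copy of the first pattern (last bit flipped)
# and iterate the map toward a fixed point.
x = np.array([1, -1, 1, -1, -1])
for _ in range(3):
    x = rcam_step(patterns, x, ecam)
```

In this toy run the exponential weighting sharply favors the best-matching pattern, so the noisy probe is pulled back to the first stored pattern, which is then a fixed point of the update.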

Cited by 21 publications (23 citation statements)
References 38 publications
“…In contrast to the original Hopfield neural network which has a limited storage capacity, some RCNN models can reach the storage capacity of an ideal associative memory [43]. Furthermore, certain RCNNs can be viewed as kernelized versions of the Hopfield network with Hebbian learning [16,17,18]. Finally, RCNNs are closely related to the dense associative memory model introduced by Krotov and Hopfield to establish the duality between associative memories and deep learning [44,45].…”
Section: Quaternion-valued Recurrent Correlation Neural Network
confidence: 99%
“…Dynamical Neural Networks (NN) are powerful methods which are applied in system identification and modeling of nonlinear dynamical systems (Nørgaard, 2000; Janczak, 2005; Liu, 2001), classification of time series (Ao, 2010; Hu & Hwang, 2010), and pattern recognition using memories (Zurada, 1992; Perfetti & Ricci, 2008; Wang et al., 1990; Shen & Cruz, 2005; Chartier & Boukadoum, 2006; Sudo et al., 2009). From a neurobiological perspective, these methods are important for emulating and explaining different biological behaviors, including information storage and recall (Zeng and Zheng, 2012).…”
Section: Introduction
confidence: 99%
“…Moreover, these associative memories are robust against input noise and have a practically constant retrieval time, independent of the number of stored associations. These memories have been intensively studied in the past (e.g., Kohonen [1977], Palm [1980, 2013], and Willshaw [1971]), with many successful applications (e.g., Sudo et al. [2009], Perfetti and Ricci [2008], and Annovi et al. [2013]).…”
Section: Introduction
confidence: 99%