2020
DOI: 10.1101/2020.07.01.180372
Preprint

H-Mem: Harnessing synaptic plasticity with Hebbian Memory Networks

Abstract: The ability to base current computations on memories from the past is critical for many cognitive tasks such as story understanding. Hebbian-type synaptic plasticity is believed to underlie the retention of memories over medium and long time scales in the brain. However, it is unclear how such plasticity processes are integrated with computations in cortical networks. Here, we propose Hebbian Memory Networks (H-Mems), a simple neural network model that…
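The abstract is cut off above, but the mechanism it names, Hebbian synaptic plasticity as a memory substrate, is concrete enough to sketch. Below is a minimal numpy illustration of an outer-product ("Hebbian") associative memory with a key-value write and read. This is a generic sketch of the idea, not the authors' H-Mem implementation; all names and dimensions are our own.

```python
import numpy as np

class HebbianMemory:
    """Associative memory with a Hebbian (outer-product) write rule.

    Illustrative sketch only; not the H-Mem reference implementation.
    """

    def __init__(self, key_dim: int, value_dim: int, learning_rate: float = 1.0):
        self.W = np.zeros((value_dim, key_dim))
        self.learning_rate = learning_rate

    def write(self, key: np.ndarray, value: np.ndarray) -> None:
        # Hebbian update: strengthen weights between co-active key and
        # value units in proportion to their outer product.
        self.W += self.learning_rate * np.outer(value, key)

    def read(self, key: np.ndarray) -> np.ndarray:
        # Retrieval is a single matrix-vector product with the stored weights.
        return self.W @ key


rng = np.random.default_rng(0)
key = rng.standard_normal(64)
key /= np.linalg.norm(key)                 # unit-norm key gives exact recall
value = rng.standard_normal(32)

mem = HebbianMemory(key_dim=64, value_dim=32)
mem.write(key, value)
print(np.allclose(mem.read(key), value))   # True for a single stored pair
```

A single matrix suffices because retrieval reduces to one matrix-vector product; superimposing many associations in the same matrix introduces interference between stored pairs.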

Cited by 5 publications (18 citation statements)
References 25 publications
“… Abbreviations: SMA, sequential multi‐head attention model; MemNN, memory network; WSH, weakly supervised heuristic; H‐mem, Hebbian memory networks (Limbacher & Legenstein, 2020); DMN, dynamic memory network (Xiong et al, 2016); SAM, self‐attentive associative memory (Le et al, 2020). …”
Section: Results (mentioning)
confidence: 99%
“…However, we noticed a significant performance gap, when our results were compared to other recent memory‐based neural networks (Table 2). Most recent models utilize some form of memory mechanism which enables them to create long term dependency (Le et al, 2020; Limbacher & Legenstein, 2020; Xiong et al, 2016). For example, the state of the art model SAM (Le et al, 2020), keeps and dynamically updates both item and relational memory which enables the model to keep long term relational and item dependency.…”
Section: Results (mentioning)
confidence: 99%
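As a loose, illustrative reading of that description (our own simplification, not Le et al.'s actual SAM architecture): item memory can be pictured as a store of item vectors, and relational memory as an accumulator of outer products that binds co-occurring items, so that querying with one item retrieves what it was bound to.

```python
import numpy as np

dim = 16
rng = np.random.default_rng(1)

items = []                                 # "item memory": stored item vectors
relational_memory = np.zeros((dim, dim))   # "relational memory": pairwise bindings

def store(item: np.ndarray) -> None:
    """Store an item and bind it (outer product) to every earlier item."""
    global relational_memory
    for prev in items:
        relational_memory += np.outer(prev, item)
    items.append(item)

def related(query: np.ndarray) -> np.ndarray:
    """Retrieve a mixture of the items the query was bound with."""
    return relational_memory.T @ query

a, b = rng.standard_normal(dim), rng.standard_normal(dim)
store(a)
store(b)
print(np.dot(related(a), b) > 0)           # querying with a recovers (roughly) b
```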
“…From a machine learning perspective, this architecture can be motivated from memory-augmented neural networks, a class of ANN models that were shown to outperform standard ANNs in memory-dependent computational tasks. This class includes networks with key-value memory systems such as memory-networks [17], [18], Hebbian memory networks [19], and transformers [20]. The latter have been shown to be particularly powerful for language-processing, giving rise to language models such as GPT-3 [21].…”
Section: Spiking Neural Network With Associative Memory (mentioning)
confidence: 99%
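The common thread among the key-value memory systems listed in that quote (memory networks, H-Mem, transformer attention) is a soft read: score stored keys against a query, then return the score-weighted mixture of the stored values. A minimal sketch under our own naming, not any one paper's code:

```python
import numpy as np

def softmax(x: np.ndarray) -> np.ndarray:
    e = np.exp(x - x.max())
    return e / e.sum()

def key_value_read(query: np.ndarray, keys: np.ndarray, values: np.ndarray) -> np.ndarray:
    """Soft key-value read: score stored keys against the query, then
    return the attention-weighted mixture of the stored values."""
    weights = softmax(keys @ query)   # one attention weight per memory slot
    return weights @ values

rng = np.random.default_rng(2)
keys = rng.standard_normal((5, 8))    # 5 memory slots, key dimension 8
values = rng.standard_normal((5, 4))  # value dimension 4
query = 10.0 * keys[3]                # query strongly aligned with slot 3
# The read is dominated by slot 3, so it approximates values[3]:
print(key_value_read(query, keys, values) - values[3])
```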
“…In Table 1 we compare our model to the Spiking RelNet [35] and to the H-Mem model [19], a non-spiking memory network model. Similar to the feedback-loop from the value-layer to the input of the key-layer in our model, the H-Mem model can utilize several memory accesses conditioned on previous memory recalls.…”
Section: Question Answering (mentioning)
confidence: 99%
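The value-to-key feedback loop described in that quote enables multi-hop retrieval: the value recalled on one memory access is fed back as (part of) the next query. A hedged sketch on top of the outer-product memory from the first snippet; treating the recalled value directly as the next key is our simplifying assumption:

```python
import numpy as np

def multi_hop_read(W: np.ndarray, query: np.ndarray, hops: int = 2) -> np.ndarray:
    """Repeated associative reads in which each recalled value is fed back
    as the next query (our simplification of H-Mem's value-to-key feedback)."""
    q = query
    for _ in range(hops):
        q = W @ q   # read memory; feed the recall back in as the next key
    return q

rng = np.random.default_rng(3)
a, b, c = (rng.standard_normal(32) for _ in range(3))
a /= np.linalg.norm(a)
b /= np.linalg.norm(b)

# Two Hebbian writes chain the associations a -> b and b -> c.
W = np.outer(b, a) + np.outer(c, b)
out = multi_hop_read(W, a, hops=2)
print(np.corrcoef(out, c)[0, 1])      # close to 1: two hops recover c from a
```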