2023
DOI: 10.48550/arxiv.2302.04481
Preprint

A Hopfield-like model with complementary encodings of memories

Abstract: We present a Hopfield-like autoassociative network that stores each memory as two different activity patterns with complementary properties. The first encoding is dense and mutually correlated with a subset of other dense encodings, such that each memory represents an example of a concept. The second encoding is sparse and exhibits no correlation among examples. The network stores each memory as a linear combination of its encodings, which allows for sparse and dense encodings to be retrieved at high and low a…
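A minimal sketch of the storage scheme the abstract describes, assuming binary 0/1 units and a standard Hebbian outer-product rule. The sizes, the mixing weight rho, and the way correlation among dense examples is induced are illustrative assumptions, not the paper's exact model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and parameters (assumptions, not the paper's values).
N = 1000   # neurons
P = 20     # stored memories
a = 0.05   # sparse-coding level
rho = 1.0  # hypothetical mixing weight between the two encodings

# Dense encodings: examples correlated through a shared "concept" pattern
# (here, each example flips a random 20% of the concept's bits).
concept = rng.random(N) < 0.5
dense = np.array([concept ^ (rng.random(N) < 0.2) for _ in range(P)], dtype=float)

# Sparse encodings: low-activity, mutually uncorrelated patterns.
sparse = (rng.random((P, N)) < a).astype(float)

# Store each memory as a linear combination of its two encodings,
# using a Hebbian outer-product rule.
combined = dense + rho * sparse
W = combined.T @ combined / N
np.fill_diagonal(W, 0.0)

def retrieve(cue, theta, steps=20):
    """Synchronous threshold dynamics: per the abstract, the two encodings
    are selected by retrieving at high versus low threshold."""
    x = cue.astype(float)
    for _ in range(steps):
        x = (W @ x > theta).astype(float)
    return x
```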


Cited by 2 publications (6 citation statements)
References 18 publications
“…S2F, G). Using techniques from statistical physics, we can calculate the capacity for each type of pattern, and these theoretical results agree with our simulation (Kang and Toyoizumi, 2023).…”
Section: Results (supporting, confidence: 79%)
“…We demonstrate that both types of representations can be stored and retrieved in the same network, using a threshold to select between them. This capability can be given solid theoretical underpinnings using techniques from statistical mechanics (Kang and Toyoizumi, 2023). The convergence of MF and PP pathways in CA3 has also been the subject of previous computational investigations (Treves and Rolls, 1992; McClelland and Goddard, 1996; Kaifosh and Losonczy, 2016).…”
Section: Discussion (mentioning, confidence: 99%)
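The threshold-based selection this statement refers to can be exercised with the sketch above; a short usage example, with arbitrary illustrative threshold values (not the values derived in the paper):

```python
# Cue the network with a corrupted copy of one stored memory and vary
# the retrieval threshold (theta values are arbitrary illustrations).
cue = combined[0] * (rng.random(N) < 0.9)  # drop ~10% of active units
dense_like = retrieve(cue, theta=0.5)      # low threshold: dense encoding
sparse_like = retrieve(cue, theta=2.0)     # high threshold: sparse encoding
```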