2017
DOI: 10.1101/098939
Preprint

Chunking as a rational strategy for lossy data compression in visual working memory

Abstract: The nature of capacity limits for visual working memory has been the subject of an intense debate that has relied on models that assume items are encoded independently. Here we propose that instead, similar features are jointly encoded through a "chunking" process to optimize performance on visual working memory tasks. We show that such chunking can: 1) facilitate performance improvements for abstract capacity-limited systems, 2) be optimized through reinforcement, 3) be implemented by center-su…
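The chunking idea in the abstract, storing one joint representation for similar features rather than one per item, can be illustrated with a minimal sketch. This is a hypothetical toy example, not the authors' model: it greedily merges feature values (e.g., colour hues) that fall within a criterion distance of a chunk's running mean, so fewer distinct traces need to be held.

```python
# Toy illustration (not the authors' model): lossy "chunking" of similar
# feature values so that fewer distinct representations are stored.

def chunk_features(values, criterion):
    """Greedily merge sorted feature values that lie within `criterion`
    of the current chunk's running mean; return one mean per chunk."""
    chunks = []  # each chunk is (running_total, count)
    for v in sorted(values):
        if chunks and abs(v - chunks[-1][0] / chunks[-1][1]) <= criterion:
            total, n = chunks[-1]
            chunks[-1] = (total + v, n + 1)  # absorb item into chunk
        else:
            chunks.append((v, 1))  # start a new chunk
    return [total / n for total, n in chunks]

# Six hues collapse to three stored chunk means:
print(chunk_features([10, 12, 95, 100, 200, 203], criterion=10))
# → [11.0, 97.5, 201.5]
```

Storing three chunk means instead of six item values is the lossy compression at issue: memory demand falls, at the cost of within-chunk precision.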


Cited by 18 publications (17 citation statements)
References 85 publications
“…In contrast, BLO models a common mechanism underlying all probability distortion, where we identified two constraints, boundedness and compensation for representational uncertainty (variance), that are pervasive in models of cognitive and perceptual tasks 37,44,66,67. We found that BLO can be used to estimate an individual's probability distortion in one task and to some extent predict the same individual's performance in another task.…”
Section: Discussion
confidence: 99%
“…Specifically, we assume an encoding process Pr(ỹ|y) creating the corrupted (encoded) memory trace ỹᵢ = (x̃′, ẽ′, ñ), where x̃′ = [x̃₁, …, x̃_d], ẽ′, and ñ correspond to the corrupted memory traces of the scene features, event label, and time index, respectively. The assumption that memory traces are corrupted versions of an original stimulus is common in computational models of memory (Hemmer & Steyvers, 2009; Huttenlocher et al., 1991; Shiffrin & Steyvers, 1997) and is analogous to a capacity-limited compression (Brady, Konkle, & Alvarez, 2009; Nassar, Helmers, & Frank, 2018).…”
Section: Event Memory
confidence: 99%
“…To implement a policy computationally, we would need to describe it in some programming language, and the description length of that program (e.g., in bits or nats) imposes a demand on memory resources. Intuitively, if a policy can be "compressed" to a short description length, it will be easier to remember, much in the same way that the benefits of compression have been studied in memory for digit, verbal, and visual stimuli (Brady et al., 2009; Mathy and Feldman, 2012; Nassar et al., 2018). As we will formalize later, it turns out that perseveration arises naturally from the imperative to reduce policy complexity.…”
Section: Introduction
confidence: 99%