2022
DOI: 10.1101/2022.07.05.498835
Preprint
The curse of optimism: a persistent distraction by novelty

Abstract: Human curiosity has been interpreted as a drive for exploration and modeled by intrinsically motivated reinforcement learning algorithms. An unresolved challenge in machine learning is that these algorithms are prone to distraction by reward-independent stochastic stimuli. We ask whether humans exhibit the same distraction pattern in their behavior as the algorithms. To answer this question, we design a multi-step decision-making paradigm containing an unknown number of states in a stochastic part of the envir…

Cited by 2 publications (4 citation statements)
References 109 publications (172 reference statements)
“…We developed a generalized model of novelty computation that leverages kernel mixture models to capture the effect of stimulus similarities on novelty computation and extends current computational novelty models to continuous state spaces. In tasks where all stimuli are discrete and distinct, our kernel-based novelty model is equivalent to count-based novelty [14,15,[18][19][20][21][22][23]46]. We show that kernel-based novelty captures novelty responses in mouse V1 [39] that are unexplained by count-based models.…”
Section: Discussion (citation type: mentioning; confidence: 89%)
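The equivalence claimed in the statement above — kernel-based novelty reducing to count-based novelty when stimuli are discrete and distinct — can be illustrated with a minimal sketch. A Gaussian kernel over a 1-D stimulus space and a 1/(pseudo-count + 1) decay are illustrative assumptions here; the paper's kernel mixture model is more general.

```python
import math

def kernel_novelty(history, stimulus, bandwidth=1.0):
    """Kernel-based novelty: similar past stimuli reduce novelty.

    Each past observation contributes a similarity weight in [0, 1];
    the sum acts as a 'pseudo-count' of how familiar the stimulus is.
    """
    pseudo_count = sum(
        math.exp(-((stimulus - s) ** 2) / (2 * bandwidth ** 2))
        for s in history
    )
    return 1.0 / (pseudo_count + 1.0)

# Well-separated stimuli contribute ~0 similarity to each other, so the
# pseudo-count reduces to the plain observation count (count-based novelty):
# kernel_novelty([0.0, 100.0], 0.0, bandwidth=0.1) is ~1/(1+1) = 0.5,
# exactly as if stimulus 0.0 had been counted once.
```

A wider bandwidth lets nearby but non-identical stimuli reduce each other's novelty, which is the behavior count-based models cannot express.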
“…Most algorithmic models of novelty computation in the brain are based on estimating the counts [14, 18, 20, 22, 23] or the count-based frequency [15, 19, 21] at which stimuli (e.g., sensory stimuli, spatial locations etc.) have been observed: the more often or frequently a stimulus is observed, the less novel it is.…”
Section: Results (citation type: mentioning; confidence: 99%)
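The count-based rule described in the statement above ("the more often a stimulus is observed, the less novel it is") can be sketched as follows. The 1/(count + 1) decay is one common choice, not necessarily the exact form used by the cited models.

```python
from collections import Counter

def count_based_novelty(history, stimulus):
    """Count-based novelty: decreases with the number of prior observations.

    An unseen stimulus has novelty 1; each repetition pushes novelty
    toward 0. Works only for discrete, distinct stimuli.
    """
    counts = Counter(history)
    return 1.0 / (counts[stimulus] + 1)

# A stimulus seen three times is less novel than an unseen one:
# count_based_novelty(["A", "A", "A"], "A") = 0.25
# count_based_novelty(["A", "A", "A"], "B") = 1.0
```

A count-based frequency variant would instead normalize counts by the total number of observations; both share the limitation that two similar but non-identical stimuli are treated as unrelated.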