2021
DOI: 10.31234/osf.io/p5sj7
Preprint
Distributed neural systems enable flexible attention updating during category learning

Abstract: To accurately categorize novel items, humans learn to selectively attend to the stimulus dimensions that are most relevant to the task. Models of category learning describe the interconnected cognitive processes that contribute to selective attention as observations of stimuli and category feedback are progressively acquired. The Adaptive Attention Representation Model (AARM), for example, provides an account whereby categorization decisions are based on the perceptual similarity of a new stimulus to stor…
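The abstract describes decisions driven by attention-weighted similarity between a new stimulus and stored exemplars. A minimal sketch of that general mechanism, in the style of classic exemplar models, is shown below. This is an illustration, not AARM itself: the function names, the exponential-decay similarity form, and all parameter values are assumptions for demonstration.

```python
import math

def similarity(stimulus, exemplar, attention, c=1.0):
    """Attention-weighted similarity between a stimulus and one stored exemplar.

    Each dimension's distance is scaled by its learned attention weight, then
    similarity decays exponentially with the weighted distance (illustrative form).
    """
    dist = sum(w * abs(s - e) for w, s, e in zip(attention, stimulus, exemplar))
    return math.exp(-c * dist)

def category_evidence(stimulus, exemplars_by_category, attention):
    """Evidence for each category: summed similarity to that category's exemplars."""
    return {
        cat: sum(similarity(stimulus, ex, attention) for ex in exemplars)
        for cat, exemplars in exemplars_by_category.items()
    }

# Hypothetical two-dimensional stimuli: dimension 0 separates the categories,
# dimension 1 is irrelevant, and attention has shifted toward dimension 0.
exemplars = {"A": [(0.1, 0.5), (0.2, 0.4)], "B": [(0.9, 0.5), (0.8, 0.6)]}
attention = (0.9, 0.1)  # learned weights favoring the diagnostic dimension
evidence = category_evidence((0.15, 0.9), exemplars, attention)
```

Because attention down-weights the irrelevant second dimension, the probe stimulus accrues more evidence for category A despite its large mismatch on dimension 1.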

Cited by 1 publication (1 citation statement)
References 98 publications (163 reference statements)
“…Given that the context model estimates salience with considerably less constraint than the fixation-informed model, it is potentially the case that additional weighting of information occurs during the decision process. Indeed, our previous work has suggested that observers adapt to environmental changes by continuously updating how different sources of information impact their decisions (Galdo et al., 2021; Weichart, Evans, et al., 2021).…”
Section: Discussion
Confidence: 99%