2022
DOI: 10.1016/j.cogpsych.2022.101508

The quest for simplicity in human learning: Identifying the constraints on attention

Cited by 7 publications (24 citation statements)
References 97 publications
“…By contrast, the right panel of Figure 1D shows that when the learner tries to minimize the total amount of attention, both dimensions are still attended, but neither dimension is attended as much as they would be if the learner only maximized accuracy. In Galdo et al. (2022), we performed a "switchboard" analysis (e.g., Turner, 2019; Turner et al., 2018) on four data sets collected by Blair and colleagues (McColeman et al., 2014) and one data set from Mack et al. (2016). We found compelling evidence that humans used multiple goals during learning, especially the ones depicted in Figure 1, and the degree to which those goals were prioritized differed across individual participants.…”
Section: Goals 2 and 3: Reducing Attention (mentioning)
Confidence: 89%
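The quoted passage contrasts a learner that only maximizes accuracy with one that also minimizes the total amount of attention. The toy sketch below is an illustration under stated assumptions, not the authors' implementation: the error function, the penalty weight, and all names are made up for the example. It shows how adding an attention-cost term to an accuracy objective shifts the optimum so that both dimensions are still attended, just less strongly.

```python
# Toy sketch (assumptions only, not the authors' model) of the trade-off
# between maximizing accuracy and minimizing total attention.
import numpy as np

# Hypothetical error function for a task with two relevant dimensions:
# error drops as attention to each dimension rises, with diminishing returns.
def error_fn(attention):
    return np.exp(-attention[0]) + np.exp(-attention[1])

def objective(attention, lambda_cost=0.1):
    # Accuracy goal (error term) plus an effort goal (penalty on summed attention).
    return error_fn(attention) + lambda_cost * attention.sum()

# Grid-search the best attention allocation under each goal structure.
grid = np.linspace(0.0, 5.0, 101)
candidates = np.array([[x, y] for x in grid for y in grid])
acc_only = candidates[np.argmin([error_fn(a) for a in candidates])]
acc_plus_cost = candidates[np.argmin([objective(a) for a in candidates])]

# The cost-sensitive learner still attends to both dimensions, but less to each.
print("accuracy only:      ", acc_only)        # ~[5.0, 5.0]
print("accuracy + cost:    ", acc_plus_cost)   # ~[2.3, 2.3]
```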
“…One computational model we discuss here is the adaptive attention representation model (AARM; Galdo et al., 2022; Turner, 2019; Weichart, Evans, et al., 2022), which is derived from other exemplar-based models of categorization (Estes, 1986; Medin & Schaffer, 1978; Nosofsky, 1986). In AARM (as well as other models in this class), attention is represented as a vector containing the amount of attention for each dimension of information.…”
Section: Optimizing For a Learner's Goals (mentioning)
Confidence: 99%
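As a concrete illustration of the attention-vector representation this quote describes, here is a minimal exemplar-model sketch in the spirit of Medin & Schaffer (1978) and Nosofsky (1986): attention is a vector with one entry per stimulus dimension, and it weights the dimension-wise distances used to compute similarity to stored exemplars. The function names and the exponential similarity form are assumptions for illustration, not AARM's actual equations.

```python
# Minimal exemplar-model sketch (illustrative assumptions, not AARM itself):
# attention is a vector with one weight per stimulus dimension.
import numpy as np

def attention_weighted_similarity(stimulus, exemplar, attention, sensitivity=1.0):
    # More attention to a dimension makes mismatches on that dimension costlier.
    distance = np.sum(attention * np.abs(stimulus - exemplar))
    return np.exp(-sensitivity * distance)

def choice_probabilities(stimulus, exemplars, labels, attention):
    """Summed similarity to each category's exemplars, normalized to probabilities."""
    sims = np.array([attention_weighted_similarity(stimulus, e, attention)
                     for e in exemplars])
    evidence = np.array([sims[labels == c].sum() for c in np.unique(labels)])
    return evidence / evidence.sum()

# Two binary-valued dimensions; the category depends only on dimension 0,
# and the attention vector weights dimension 0 more heavily.
exemplars = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
labels = np.array([0, 0, 1, 1])
attention = np.array([2.0, 0.5])   # one attention weight per dimension
print(choice_probabilities(np.array([0., 1.]), exemplars, labels, attention))
# ~[0.88, 0.12]: the stimulus is classified mostly by the attended dimension.
```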