2018
DOI: 10.31234/osf.io/bjh68
Preprint

Do additional features help or hurt category learning? The curse of dimensionality in human learners

Abstract: The curse of dimensionality, which has been widely studied in statistics and machine learning, occurs when additional features cause the size of the feature space to grow so quickly that learning classification rules becomes increasingly difficult. How do people overcome the curse of dimensionality when acquiring real-world categories that have many different features? Here we investigate the possibility that the structure of categories can help. We show that when categories follow a family resemblance struct…
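The core idea, that the feature space grows so quickly that a fixed amount of training experience covers less and less of it, can be illustrated with a short sketch. The Python snippet below is not from the paper; the binary-feature assumption, the feature counts, and the 32-item training set are illustrative choices rather than the study's design.

```python
# Minimal sketch (not from the paper): with d binary features, the space of
# possible feature combinations grows as 2**d, so a fixed training set covers
# a rapidly shrinking fraction of the items a learner might later encounter.

def coverage(num_features, num_training_items):
    """Best-case fraction of the feature space seen during training,
    assuming every training item is a distinct feature combination."""
    space_size = 2 ** num_features
    return min(num_training_items, space_size) / space_size

for d in (2, 4, 8, 16):
    print(f"{d:>2} features: space size = {2 ** d:>6}, "
          f"coverage by 32 training items = {coverage(d, 32):.4%}")
```

Under these assumptions, coverage falls from 100% at 4 features to roughly 0.05% at 16 features, which is the sense in which additional features make classification rules harder to learn from the same amount of experience.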

Cited by 2 publications (3 citation statements)
References 34 publications

“…While it is plausible that decision-makers encode all relevant stimulus information from the low-dimensional stimuli typically considered in the laboratory, in high-dimensional environments, encoding all available sensory information is inefficient, and can impair learning. This reflects a fundamental computational constraint (known as the curse of dimensionality), which affects both machine-learning algorithms (Hastie, Tibshirani, & Friedman, 2009; Li et al., 2017) and human decision-makers (e.g., Bulgarella & Archer, 1962; Edgell et al., 1996; Pishkin, Bourne, & Fishkin, 1974; Vong et al., 2018).…”
Section: Introduction
mentioning
confidence: 99%
“…In the category-learning literature, stimuli with two to four features are common (e.g., Nosofsky, 1986; Shepard et al., 1961), and stimuli with 16 features are considered to be high-dimensional (e.g., Vong, Hendrickson, Navarro, & Perfors, 2018).…”
mentioning
confidence: 99%
“…Overfitting and, even more so, brute-force memorization should exclude generalization by definition, even where human beings are concerned. For instance, the concepts of capacity ([24], [25], [26], [27], [28], [29]), bias ([30], [31]), overfitting ([32], [33]), and generalization ([34], [35]) have been widely explored in cognitive psychology as well.…”
mentioning
confidence: 99%