2018
DOI: 10.31234/osf.io/r673v
Preprint
Perceptual dimensions influence auditory category learning

Abstract: Human category learning appears to be supported by dual learning systems. Previous research indicates the engagement of distinct neural systems in learning categories that require selective attention to dimensions versus those that require integration across dimensions. This evidence has largely come from studies of learning across perceptually separable visual dimensions, but recent research has applied dual systems models to understanding auditory and speech categorization. Since differential engagement of t…

Cited by 2 publications (17 citation statements)
References 38 publications
“…After hearing nonspeech stimuli in which two auditory dimensions are perfectly correlated, listeners can discriminate between stimuli that follow the same correlation as in training, but not those that violate the correlation (Stilp et al., 2010; Stilp & Kluender, 2012), suggesting that correlations among dimensions can drive auditory perceptual space learning. The integration of perceptual dimensions for perceiving speech is not always determined by experience (Kingston et al., 2008; S. Lee & Katz, 2016), but several studies have suggested that an experience-based perceptual space learning process could play a role (Holt et al., 2001; Nearey, 1997; Schertz et al., 2020) and could interact in nontrivial ways with subsequent learning of cue weights (Roark et al., 2020; Roark & Holt, 2019; Scharinger et al., 2013).…”
Section: Empirical Evidence for Perceptual Space Learning (mentioning, confidence: 99%)
“…In auditory category learning studies, listeners are trained to distinguish between two or more novel auditory categories. These studies report that listeners can, in principle, readily learn to recognise novel auditory categories, although overall learning accuracy depends on the specific nature of the acoustic manipulation (e.g., Goudbeek et al., 2009; Scharinger et al., 2013; Roark & Holt, 2019; Holt & Lotto, 2008; Roark & Holt, 2018; Gabay et al., 2015).…”
Section: Introduction (mentioning, confidence: 99%)
“…Crucially, some of these studies have also directly explored how listeners use specific acoustic cues during learning to build representations of the newly learned categories. In these studies, listeners are usually trained to recognise novel auditory categories that differ either in one acoustic property or in two co-varying acoustic properties (Goudbeek et al., 2009; Scharinger et al., 2013; Roark & Holt, 2019). When relating these acoustic properties to listeners' category judgements during and after training, these studies usually show that listeners at least partially base their category judgements on the acoustic properties that do indeed distinguish the categories.…”
Section: Introduction (mentioning, confidence: 99%)