2009
DOI: 10.1111/j.1467-7687.2009.00822.x

Statistical learning of phonetic categories: insights from a computational approach

Abstract: Recent evidence (Maye, Werker & Gerken, 2002) suggests that statistical learning may be an important mechanism for the acquisition of phonetic categories in the infant's native language. We examined the sufficiency of this hypothesis and its implications for development by implementing a statistical learning mechanism in a computational model based on a Mixture of Gaussians (MOG) architecture. Statistical learning alone was found to be insufficient for phonetic category learning; an additional competition mechanism …
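The abstract pairs distributional (statistical) learning over a Mixture of Gaussians with an additional competition mechanism. The sketch below illustrates that general idea for a single acoustic cue such as voice onset time (VOT); the class name, the learning rate, and the power-sharpened responsibility rule are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumptions noted above): an online mixture-of-Gaussians learner
# in which Gaussians compete for each incoming token, so unused categories fade away.
import numpy as np

class CompetitiveMoG:
    def __init__(self, n_components=10, lr=0.01, competition=5.0, seed=0):
        rng = np.random.default_rng(seed)
        self.mu = rng.uniform(0.0, 100.0, n_components)       # category means (e.g. VOT in ms)
        self.sigma = np.full(n_components, 20.0)               # category standard deviations
        self.phi = np.full(n_components, 1.0 / n_components)   # mixing weights
        self.lr, self.competition = lr, competition

    def step(self, x):
        diff = x - self.mu
        # (unnormalised) likelihood of the token under each Gaussian
        lik = self.phi * np.exp(-0.5 * (diff / self.sigma) ** 2) / self.sigma
        # competition: sharpen responsibilities so few Gaussians claim each token
        resp = lik ** self.competition
        resp /= resp.sum()
        # gradient-style updates toward the token, weighted by responsibility
        self.mu += self.lr * resp * diff
        self.sigma += self.lr * resp * (diff ** 2 - self.sigma ** 2) / self.sigma
        self.phi += self.lr * (resp - self.phi)
        self.phi /= self.phi.sum()

# Bimodal synthetic VOT input (short-lag vs. long-lag, roughly /b/ vs. /p/).
rng = np.random.default_rng(1)
tokens = np.concatenate([rng.normal(0, 8, 2000), rng.normal(50, 10, 2000)])
model = CompetitiveMoG()
for x in rng.permutation(tokens):
    model.step(x)
print(np.round(model.mu[model.phi > 0.1]))  # means of the surviving categories
```

With competition set to 1 the update reduces to ordinary soft responsibilities, and many Gaussians tend to stay active and share the data; sharpening the responsibilities is one simple way to express the kind of competition the abstract argues is needed.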

Cited by 190 publications (227 citation statements)
References 39 publications
“…Recent models have tested whether infant-directed speech indeed contains sufficiently clear peaks for such a distributional learning mechanism to succeed. Indeed, this appears to be the case for both consonants (at least for VOT contrasts, McMurray, Aslin, & Toscano, 2009) and vowels (Vallabha, McClelland, Pons, Werker, & Amano, 2007; Benders, 2013). In short, computational models of first language acquisition provide evidence that infants' input contains sufficient information to learn phonetic contrasts without requiring lexical knowledge.…”
Section: Distribution-driven Learning of Perception (mentioning)
confidence: 95%
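As a concrete illustration of the kind of test these models run, the sketch below fits a two-component Gaussian mixture to a synthetic, bimodal VOT distribution and reads off the recovered peaks. The numbers are invented stand-ins for infant-directed speech measurements, not data from the cited studies.

```python
# Hedged sketch: can a 2-component Gaussian mixture recover two clear VOT peaks?
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# synthetic short-lag vs. long-lag VOT tokens (ms), roughly English /b/-/p/-like
vot = np.concatenate([rng.normal(5, 10, 1000),
                      rng.normal(55, 15, 1000)]).reshape(-1, 1)

gmm = GaussianMixture(n_components=2, random_state=0).fit(vot)
print("recovered category means (ms):", np.sort(gmm.means_.ravel()).round(1))
print("mixing weights:", np.round(gmm.weights_, 2))
```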
“…Accounts based purely on statistical distributions of acoustic cues would have difficulty explaining the formation of separate categories in such cases where acoustic information from two speech categories is very similar. Recent evidence from computational models also suggests that acoustic distributional information alone may not be sufficient for phonetic category acquisition (Feldman, Myers, White, Griffiths, & Morgan, 2011; McMurray, Aslin, & Toscano, 2009a). Feldman et al. (2011) suggest that phonetic category development occurs as part of extracting meaning from language, through association of phonetic distributions with lexical items.…”
Section: Discussion (mentioning)
confidence: 99%
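The sketch below illustrates both halves of this statement: when two categories overlap heavily on an acoustic cue, model selection over the pooled acoustics prefers a single category, whereas conditioning on the word frame each token occurred in (the lexical route suggested by Feldman et al., 2011) reveals two distinct means. All distributions and word labels here are hypothetical.

```python
# Hypothetical illustration: pooled acoustics hide an overlapping contrast
# that word-level (lexical) context can separate.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# one acoustic cue for two heavily overlapping categories, each tied to a word frame
cat_a = rng.normal(loc=0.0, scale=1.0, size=500)   # tokens occurring in "word A"
cat_b = rng.normal(loc=0.5, scale=1.0, size=500)   # tokens occurring in "word B"
pooled = np.concatenate([cat_a, cat_b]).reshape(-1, 1)

# purely distributional learning: compare 1- vs. 2-component fits by BIC
bic = {k: GaussianMixture(n_components=k, random_state=0).fit(pooled).bic(pooled)
       for k in (1, 2)}
print("BIC by number of components:", bic)   # the 1-component model typically wins here

# adding lexical context recovers two different category means
print("mean in word A:", round(cat_a.mean(), 2),
      "| mean in word B:", round(cat_b.mean(), 2))
```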
“…Infants attend to distributional characteristics of their input (Maye et al., 2002, 2008), leading to the hypothesis that phonetic categories could be acquired on the basis of bottom-up distributional learning alone (de Boer and Kuhl, 2003; Vallabha et al., 2007; McMurray et al., 2009). However, this would require sound categories to be well separated, which often is not the case; for example, see Figure 1, which shows the English vowel space that is the focus of this paper.…”
Section: Background and Overview of Models (mentioning)
confidence: 99%
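One way to make the "well separated" requirement concrete is to quantify the overlap between two category distributions, for example with the Bhattacharyya distance between Gaussian models of neighbouring vowels in F1/F2 space. The means and covariances below are invented for illustration and are not measurements from the figure mentioned in the citation.

```python
# Quantifying how separable two hypothetical vowel categories are in F1/F2 space.
import numpy as np

def bhattacharyya_distance(mu1, cov1, mu2, cov2):
    """Bhattacharyya distance between two Gaussians; larger means better separated."""
    cov = (cov1 + cov2) / 2
    diff = mu1 - mu2
    term1 = 0.125 * diff @ np.linalg.solve(cov, diff)
    term2 = 0.5 * np.log(np.linalg.det(cov) /
                         np.sqrt(np.linalg.det(cov1) * np.linalg.det(cov2)))
    return term1 + term2

# made-up F1/F2 (Hz) means and covariances for three vowel categories
mu_i  = np.array([400.0, 2200.0]); cov_i  = np.diag([60.0**2, 180.0**2])
mu_ih = np.array([480.0, 2000.0]); cov_ih = np.diag([70.0**2, 200.0**2])
mu_a  = np.array([750.0, 1250.0]); cov_a  = np.diag([80.0**2, 200.0**2])

print("close pair:  ", round(bhattacharyya_distance(mu_i, cov_i, mu_ih, cov_ih), 2))
print("distant pair:", round(bhattacharyya_distance(mu_i, cov_i, mu_a, cov_a), 2))
```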
“…Following previous models of vowel learning (de Boer and Kuhl, 2003; Vallabha et al., 2007; McMurray et al., 2009; Dillon et al., 2013) we assume that vowel tokens are drawn from a Gaussian mixture model. The Infinite Gaussian Mixture Model (IGMM) (Rasmussen, 2000) includes a DP prior, as described above, in which the base distribution H_C generates multivariate Gaussians drawn from a Normal Inverse-Wishart prior.…”
Section: Phonetic Categories: IGMM (mentioning)
confidence: 99%
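For readers who want to experiment with this kind of model, the sketch below fits a Dirichlet-process Gaussian mixture to synthetic F1/F2 tokens using scikit-learn's variational approximation. It is a stand-in for the IGMM sampler described in the citation, not the authors' implementation or parameterisation, and the data are invented.

```python
# Approximate sketch: Dirichlet-process Gaussian mixture over synthetic vowel tokens.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(1)
# hypothetical F1/F2 (Hz) tokens drawn from three vowel categories
tokens = np.vstack([
    rng.multivariate_normal([300, 2300], np.diag([50**2, 150**2]), 200),
    rng.multivariate_normal([600, 1700], np.diag([60**2, 160**2]), 200),
    rng.multivariate_normal([750, 1200], np.diag([70**2, 170**2]), 200),
])

dpgmm = BayesianGaussianMixture(
    n_components=10,                      # truncation level; effective number is inferred
    weight_concentration_prior_type="dirichlet_process",
    covariance_type="full",
    max_iter=500,
    random_state=0,
).fit(tokens)

# components with non-negligible weight approximate the inferred vowel categories
print(np.round(dpgmm.weights_, 3))
```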