2019
DOI: 10.3389/fnhum.2019.00344
Auditory and Somatosensory Interaction in Speech Perception in Children and Adults

Abstract: Multisensory integration (MSI) allows us to link sensory cues from multiple sources and plays a crucial role in speech development. However, it is not clear whether humans have an innate ability or whether repeated sensory input while the brain is maturing leads to efficient integration of sensory information in speech. We investigated the integration of auditory and somatosensory information in speech processing in a bimodal perceptual task in 15 young adults (age 19–30) and 14 children (age 5–6). The partici…

Cited by 19 publications (27 citation statements)
References 86 publications
“…While speech perception is often considered to be based on the acoustic properties or auditory objects of the signal (Bizley and Cohen 2013; Diehl and Kluender 1989), alternative findings suggest that perception can be modulated by external (sensory) input that codes motor (or sensorimotor) information to the listener (Gick and Derrick 2009; Ito et al. 2009; Ogane et al. 2020; Sams et al. 2005; Sato et al. 2013). For example, air puffs to the cheek of a perceiver that coincide with auditory speech stimuli alter participants' perceptual judgements (Gick and Derrick 2009), while orofacial skin stimulation changes the auditory perceptual discrimination of speech (Ito et al. 2009; Ogane et al. 2020; Trudeau-Fisette et al. 2019). These observations are consistent with the Motor Theory of Speech Perception (Liberman and Mattingly 1985) and with the Direct Realist account, which suggests that perceiving speech is perceiving vocal tract gestures (cf.…”
Section: Our Perceptions Are Based On What We Know and What We Know Is Dependent On What
confidence: 99%
“…We used movement-related stimulation of the listener through orofacial skin stretch to evaluate the neural response to somatosensory stimulation of the facial skin, as measured by changes in electroencephalographic (EEG) activity. Orofacial somatosensory input associated with facial skin deformation provides motion information for speech production (Connor and Abbs 1998; Ito and Gomi 2007; Ito and Ostry 2010; Johansson et al. 1988), and has been shown to interact in motion-specific ways to influence speech perception (Ito et al. 2009; Ogane et al. 2020; Trudeau-Fisette et al. 2019). Stimulation associated with facial skin deformation also changes cortical potentials for auditory speech perception (Ito et al. 2013, 2014), but the stimulation of lip tapping does not (Möttönen et al. 2005).…”
Section: Our Perceptions Are Based On What We Know and What We Know Is Dependent On What
confidence: 99%
“…A body of research supports the facilitatory role of co-occurring mouthing and vocalization in the development of both language perception and production (22, 69–74). For example, Fagan and Iverson (71) identified changes in articulatory postures arising from the co-occurrence of vocalizations with mouthing in 40 TD infants aged 6–9 months. They reported that the babbling of infants that occurred concurrently with mouthing contained a greater variety of supraglottal consonants (e.g., /b/, /d/, /g/) than non-mouthing vocalizations.…”
Section: The Vocal Behaviors Of Children At-risk Of CP
confidence: 99%
“…This study is part of a larger project investigating the development of such perceptual processes in school-aged children. In a recent paper (Trudeau-Fisette et al., 2019), we investigated a specific case of multisensory processing, namely, the interaction between auditory and somatosensory input during vowel perception in children and adults. More specifically, 10 synthesized vowels, equally spaced along a continuum from /e/ (as in "fée", fairy) to /ø/ (as in "feu", fire), were presented in the auditory modality to francophone adults and children ranging in age from 4 to 6 years old.…”
Section: Introduction
confidence: 99%