The intermodal representation of speech in newborns (1999)
DOI: 10.1111/1467-7687.00052

Abstract: It has been proposed that speech is specified by the eye, the ear, and even the skin. Kuhl and Meltzoff (1984) showed that 4-month-olds could lip-read to an extent. Given the age of the infants, it was not clear whether this was a learned skill or a by-product of the primary auditory process. This paper presents evidence that neonate infants (less than 33 h) show virtually identical patterns of intermodal interaction as do 4-month-olds. Since they are neonates, it is unlikely that learning was involved. The re…

Cited by 47 publications (47 citation statements)
References 15 publications
“…When presented with two side-by-side images of the same woman's face articulating "ee" and "ouu" respectively, infants of 4 months (Kuhl & Meltzoff 1982) and 2 months (Patterson & Werker 2003), and possibly even newborns (Aldridge et al 1999), look longer to the side that is articulating the sound that they hear. Infants this young can also match heard and seen consonants (MacKain et al 1983) and do so best when the matching face is on the right side, indicating involvement of the left hemisphere language areas.…”
Section: Audiovisual Matching and Integration
confidence: 99%
“…Similar to speech sounds and letters, optimized AV speech integration develops over the course of many years (McGurk & MacDonald, 1976; Ross et al, 2011; Sekiyama & Burnham, 2008). However, this process begins much earlier than reading, with some level of sensitivity to the congruency between the sounds of certain vowels and their corresponding articulations already present in infants as young as 2 months (Patterson & Werker, 2003) and even, it has been argued, in newborns (Aldridge, Braga, Walton, & Bower, 1999). We turn now to this literature, while keeping in mind that the acquisition of these multisensory associations might be considerably easier because 1) the speech sounds and mouth gestures are causally related, 2) audio-visual speech is encountered beginning in infancy and on a very regular basis, and 3) the learning of these relationships is largely implicit rather than explicit.…”
Section: Multisensory Processing, Reading and Dyslexia
confidence: 99%
“…Recently, such auditory-visual vowel matching ability has been found for 2-month-old infants (Patterson & Werker, 2003), and there also is some evidence for auditory-visual matching in newborns (Aldridge, Braga, Walton, & Bower, 1999). While these studies suggest that auditory-visual matching appears early in development, both experiential and maturational influences are nevertheless evident.…”
confidence: 94%