This study explores the patterns of gesture and speech combinations from the babbling period to the one-word stage and the temporal alignment between the two modalities.
The results refine the phonological synchronization rule (McNeill, 1992), showing that gesture apexes are anchored in intonation peaks and that gesture and prosodic movements are bound by prosodic phrasing.
Although research has shown that adults can benefit from the presence of beat gestures in word recall tasks, studies have failed to conclusively generalize these findings to preschool children. This study investigated whether the presence of beat gestures helps children to recall information when these gestures have the function of singling out a linguistic element in its discourse context. A total of 106 3- to 5-year-old children were asked to recall a list of words within a pragmatically child-relevant context (i.e., a storytelling activity) in which the target word was or was not accompanied by a beat gesture. Results showed that children recalled the target word significantly better when it was accompanied by a beat gesture than when it was not, indicating a local recall effect. Moreover, the recall of adjacent non-target words did not differ depending on the condition, revealing that beat gestures seem to have a strictly local highlighting function (i.e., no global recall effect). These results demonstrate that preschoolers benefit from the pragmatic contribution offered by beat gestures when they function as multimodal markers of prominence.
There is considerable debate about whether early vocalizations mimic the target language and whether prosody signals emergent intentional communication. A longitudinal corpus of four babbling Catalan-learning infants was analyzed to investigate whether children use different prosodic patterns to distinguish communicative from investigative vocalizations and to express intentionality. A total of 2,701 vocalizations from 0;7 to 0;11 were coded acoustically (by marking pitch range and duration), gesturally, and pragmatically (by marking communicative status and specific pragmatic function). The results showed that communicative vocalizations were shorter and had a wider pitch range than investigative vocalizations, and that these patterns in communicative vocalizations depended on the intention of the vocalizations: requests and expressions of discontent displayed a wider pitch range and longer duration than responses or statements. These results support the hypothesis that babbling children can successfully use a set of prosodic patterns to signal intentional speech.
Communicative hand gesticulations are tightly coupled to prosodic aspects of speech. Psychologists have characterized this multimodal synchrony as a preplanning process governed by a cognitive timing mechanism acquired only later in development. However, it has recently been found that acoustic markers of emphatic stress arise naturally during steady-state phonation when upper-limb movements impart physical impetus on the body, most likely affecting acoustics via respiratory activity. In this confirmatory study, participants (N = 29) uttered consonant-vowel (CV) monosyllables (/pa/) in rhythmic fashion while moving the upper limbs (or not). We show that respiration-related movement is affected by (especially high-impetus) gesturing when vocalizations occur near peaks in physical impetus. We further show that gesture-induced moments of bodily impulse increase the amplitude envelope of speech, while not similarly affecting the fundamental frequency (F0). Finally, we find tight relations between respiration-related movement and vocalization even in the absence of movement, and even stronger respiration-acoustic relations when upper-limb movement is present. The current findings expand a developing line of research showing that speech acoustics are modulated by functional biomechanical linkages between hand gesture and the respiratory system.
This study examines the influence of the position of prosodic heads (accented syllables) and prosodic edges (prosodic word and intonational phrase boundaries) on the timing of head movements. Gesture movements and prosodic events tend to be temporally aligned in discourse, with the most prominent part of a gesture typically aligned with a prosodically prominent syllable in speech. However, little is known about the impact of the position of intonational phrase boundaries on gesture-speech alignment patterns. Twenty-four Catalan speakers produced spontaneous (Experiment 1) and semi-spontaneous head gestures with a confirmatory function (Experiment 2), along with phrase-final focused words in different prosodic conditions (stress-initial, stress-medial, and stress-final). Results showed (a) that the scope of head movements is the associated focused prosodic word, (b) that the left edge of the focused prosodic word determines where the interval of gesture prominence starts, and (c) that the speech-anchoring site for the gesture peak (or apex) depends both on the location of the accented syllable and on the distance to the upcoming intonational phrase boundary. These results demonstrate that prosodic heads and edges have an impact on the timing of head movements, and therefore that prosodic structure plays a central role in the timing of co-speech gestures.
The development of body movements such as hand or head gestures, or facial expressions, seems to go hand in hand with the development of speech abilities. We know that very young infants rely on the movements of their caregivers' mouths to segment the speech stream, that infants' canonical babbling is temporally related to rhythmic hand movements, that narrative abilities emerge at a similar time in speech and gesture, and that children make use of both modalities to access complex pragmatic intentions. Prosody has emerged as a key linguistic component in this speech-gesture relationship, yet its exact role in the development of multimodal communication is still not well understood. For example, it is not clear what the relative weights of speech prosody and body gestures are in language acquisition, whether both modalities develop at the same time, or whether one modality needs to be in place for the other to emerge. The present paper reviews existing literature on the interactions between speech prosody and body movements from a developmental perspective in order to shed some light on these issues.
Expressive moments in communicative hand gesture often align with emphatic stress in speech. It has recently been found that acoustic markers of emphatic stress arise naturally during steady-state phonation when upper-limb movements impart physical impulse on the body, most likely affecting acoustics via respiratory activity. In this confirmatory study, participants (N = 29) repeatedly uttered consonant-vowel (CV) monosyllables (/pa/) while moving the upper limbs in particular phase relations with speech, or not moving them. We show that respiration-related activity is affected by (especially high-impulse) gesturing when vocalizations occur near peaks in physical impulse. We further show that gesture-induced moments of bodily impulse increase the amplitude envelope of speech, while not similarly affecting the fundamental frequency (F0). Finally, tight relations between respiration-related activity and vocalization were observed even in the absence of movement, but even more so when upper-limb movement was present. The current findings expand a developing line of research showing that speech is modulated by functional biomechanical linkages between hand gesture and the respiratory system. This identification of gesture-speech biomechanics promises to provide an alternative phylogenetic, ontogenetic, and mechanistic explanatory route for why communicative upper-limb movements co-occur with speech in humans.