In this paper, we advance a comprehensive gesture labelling proposal that highlights the independence of the prosodic and semantic properties of different gesture types and, at the same time, challenges a simplistic definition of beat gestures as biphasic, rhythmic, non-meaningful gestures (e.g., [1][2]). Following McNeill's [3] original proposal on gesture dimensions, we argue, first, that all gesture types can associate with prosodic prominence: although beat gestures typically display this rhythmic behavior, representational and pointing gestures do so as well. Second, with respect to meaning, while beat gestures represent neither referential nor metaphoric content, they can serve a range of meaningful pragmatic and discursive functions in speech, which deserve further investigation. From a practical point of view, we propose that all non-referential gestures be initially classified as forms of beat gestures with a set of associated properties related to gesture form, prosodic form, and pragmatic form. This gesture labelling proposal independently codes for (a) the form of gestures, (b) their temporal association with prosodic prominence, and (c) their pragmatic meaning. We claim that this move allows for a more complete analysis of gestures in large-scale studies and opens the way for more comprehensive assessments of the interaction between gesture forms, prosodic forms, and semantic forms using labelled corpora.
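A minimal sketch of how the three independent coding tiers described above might be represented for a single gesture token. The field names, label values, and the idea of storing a ToBI accent label are illustrative assumptions, not the authors' released scheme.

```python
# Illustrative sketch (not the authors' coding scheme): one gesture token with
# form, prosodic association, and pragmatic function coded independently.
from dataclasses import dataclass
from typing import Optional

@dataclass
class GestureLabel:
    # (a) Gesture form (hypothetical label set)
    form: str                               # e.g. "beat", "iconic", "metaphoric", "deictic"
    referential: bool                       # does the gesture depict or point to a referent?
    # (b) Temporal association with prosodic prominence
    apex_time: float                        # time (s) of the gesture apex in the video
    associated_pitch_accent: Optional[str]  # e.g. ToBI label of the co-occurring accent, if any
    # (c) Pragmatic/discursive function, coded independently of form
    pragmatic_function: Optional[str]       # e.g. "emphasis", "discourse-marking", "stance"

example = GestureLabel(
    form="beat",
    referential=False,
    apex_time=12.43,
    associated_pitch_accent="H*",
    pragmatic_function="emphasis",
)
print(example)
```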
Purpose: Previous studies have investigated the effects of the inability to produce hand gestures on speakers' prosodic features of speech; however, the potential effects of encouraging speakers to gesture have received less attention, especially in naturalistic settings. This study aims to investigate the effects of encouraging the production of hand gestures on the following speech correlates: speech discourse length (number of words and discourse length in seconds), disfluencies (filled pauses, self-corrections, repetitions, insertions, interruptions, speech rate), and prosodic properties (measures of fundamental frequency [F0] and intensity). Method: Twenty native Italian speakers took part in a narration task in which they described the content of short comic strips to a confederate listener in one of two conditions: (a) the nonencouraging condition (N), in which no instructions about gesturing were given, and (b) the encouraging condition (E), in which participants were instructed to gesture while telling the story. Results: Instructing speakers to gesture effectively led to higher gesture rates and more salient gestures. Significant differences were found for (a) discourse length (e.g., the narratives had more words in E than in N) and (b) acoustic measures (F0 maximum, maximum intensity, and mean intensity were higher in E than in N). Conclusion: The study shows that asking speakers to use their hands while describing a story can affect narration length and can also affect F0 and intensity metrics. By showing that enhancing the gesture stream can affect speech prosody, this study provides further evidence that gestures and prosody interact in the process of speech production.
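A minimal sketch of how the acoustic measures named above (F0 maximum and maximum/mean intensity) could be extracted per narration, assuming the Praat wrapper `parselmouth` and one WAV file per recording. The file name and the simple unvoiced-frame filtering are assumptions for illustration, not the study's analysis pipeline.

```python
# Sketch: extract F0 and intensity summary measures from one narration.
import numpy as np
import parselmouth

snd = parselmouth.Sound("narration_01.wav")  # hypothetical file name

pitch = snd.to_pitch()
f0 = pitch.selected_array["frequency"]
f0 = f0[f0 > 0]                              # drop unvoiced frames (reported as 0 Hz)
f0_max = f0.max() if f0.size else float("nan")
f0_mean = f0.mean() if f0.size else float("nan")

intensity = snd.to_intensity()
db = intensity.values.flatten()              # intensity contour in dB
intensity_max = db.max()
intensity_mean = db.mean()

print(f"F0 max: {f0_max:.1f} Hz, F0 mean: {f0_mean:.1f} Hz")
print(f"Intensity max: {intensity_max:.1f} dB, mean: {intensity_mean:.1f} dB")
```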
Previous studies have investigated the effects of the inability to make hand gestures on speakers' fluency; however, the question of whether encouraging speakers to gesture affects their fluency has received little attention. This study investigates the effects of restraining (Experiment 1) and encouraging (Experiment 2) hand gestures on the following correlates of speech: speech discourse length (number of words and discourse length in seconds), disfluencies (filled pauses, self-corrections, repetitions, insertions, interruptions, silent pauses), and acoustic properties (speech rate, measures of intensity and pitch). In two experiments, 10 native speakers of Italian took part in a narration task in which they were asked to describe comic strips. Each experiment compared two conditions. In Experiment 1, subjects first received no instructions as to how to behave when narrating; they were then told to sit on their hands while speaking. In Experiment 2, subjects first received no instructions and were then actively encouraged to use hand gestures. The results showed that restraining gestures led to quieter and slower-paced speech, while encouraging gestures led to longer speech discourse, a faster speech rate, and more fluent and louder speech. Thus, both restraining and encouraging hand gestures appear to clearly affect prosodic properties of speech, particularly speech fluency.
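A small sketch of how per-narration fluency measures like those listed above (speech rate and disfluency rates) could be derived once words and disfluencies have been counted. The counts and the per-100-words normalization are illustrative assumptions, not the study's data or procedure.

```python
# Sketch: derive speech rate and disfluency rates from per-narration counts.
def fluency_measures(n_words, duration_s, disfluency_counts):
    """disfluency_counts: dict mapping disfluency type to its count in the narration."""
    total_disfluencies = sum(disfluency_counts.values())
    return {
        "speech_rate_wps": n_words / duration_s,
        "disfluencies_per_100_words": 100 * total_disfluencies / n_words,
        **{f"{kind}_per_100_words": 100 * count / n_words
           for kind, count in disfluency_counts.items()},
    }

# Hypothetical narration: 180 words in 75 s with a few disfluencies.
print(fluency_measures(180, 75.0,
                       {"filled_pause": 6, "repetition": 2, "self_correction": 1}))
```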
Gesture and multimodal communication researchers typically annotate video data manually, even though this can be a very time-consuming task. In the present work, a method to detect gestures is proposed as a fundamental step towards a semi-automatic gesture annotation tool. The proposed method can be applied to RGB videos and requires annotations of part of a video as input. The technique combines pose estimation with active learning. The experiments show that once about 27% of a video has been annotated, the remaining parts can be annotated automatically with an F-score of at least 0.85. Users can first run the tool with a small number of annotations; if the predicted annotations for the remainder of the video are not satisfactory, they can add further annotations and run the tool again. The code has been released so that other researchers and practitioners can build on this work. The tool has been confirmed to work in conjunction with ELAN.
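A conceptual sketch of the kind of pipeline described above, not the released code: frame-level pose keypoints serve as features, a classifier is trained on the manually annotated portion of the video, and the frames the model is least certain about are proposed for the next round of manual annotation (uncertainty-based active learning). The classifier choice, function names, and query budget are assumptions.

```python
# Sketch: predict gesture labels for unannotated frames and propose the most
# uncertain frames for the next round of manual annotation.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def propose_and_predict(keypoints, labels, annotated_idx, n_queries=50):
    """keypoints: (n_frames, n_features) pose features per frame.
    labels: gesture/no-gesture labels for the frames in annotated_idx."""
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(keypoints[annotated_idx], labels)

    unannotated_idx = np.setdiff1d(np.arange(len(keypoints)), annotated_idx)
    proba = clf.predict_proba(keypoints[unannotated_idx])
    uncertainty = 1.0 - proba.max(axis=1)          # least-confident frames first

    query_idx = unannotated_idx[np.argsort(-uncertainty)[:n_queries]]
    predictions = clf.predict(keypoints[unannotated_idx])
    return predictions, query_idx                  # auto labels + frames to annotate next
```

In such a workflow, the predicted frame labels would then be merged into gesture intervals and exported as an annotation tier for a tool such as ELAN.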