Background: Two main conflicting positions exist concerning the relationship between gesture and speech. The first claims that gesture and speech constitute a single bimodal production process, so that aphasia impairs both communication channels. The second posits two independent but tightly coordinated processes in a trade-off relationship between gesture and speech. According to the latter assumption, speakers with aphasia should be able to compensate for their verbal deficits through gesture. Studies provide evidence for both accounts. Furthermore, non-verbal capacities such as semantic processing and manual praxis have been shown to influence the gestural performance of speakers with aphasia.

Aims: The first aim of the current study was to clarify the relationship between gesture and speech production in aphasia by exploring how much information speakers with aphasia conveyed by gesture versus speech in narrations. The second aim was to evaluate whether these speakers exploit their full communicative potential through gesture. We compared gesture use in a verbal narration with gesture use in a silent condition in which participants were not allowed to speak. Furthermore, the influence of linguistic and non-verbal cognitive capacities on gesture was examined.

Methods & Procedures: Sixteen participants with varying degrees of aphasia severity retold short video clips in a verbal and a silent condition. In the latter, participants were asked to retell the stories exclusively through gesture. Subsequently, healthy speakers judged the comprehensibility of the gestures and of the spoken expression in the verbal narration in a forced-choice recognition task. Comprehensibility scores were compared between conditions, and potential linguistic and non-verbal influencing factors were evaluated.

Outcomes & Results: In the verbal condition, two participants conveyed more information through gesture than through spoken expression. Furthermore, half of the participants increased their gestural comprehensibility in the silent condition. The communicative efficiency of gesture was predicted by the pantomime-to-command test.
For over a century, pantomime of tool use has been employed to diagnose limb apraxia, a disorder of motor cognition primarily caused by left brain damage. While research consistently implicates damage to a left fronto-temporo-parietal network in limb apraxia, findings are inconsistent regarding the impact of damage to anterior versus posterior nodes within this network on pantomime. Complicating matters is the fact that tool use pantomime can be affected and evaluated at multiple levels. For instance, producing tool use gestures requires the consideration of semantic characteristics (e.g., how to communicate the action intention) as well as motor features (e.g., forming the appropriate grip and movement). Together, these factors may contribute substantially to the apparent discrepancies in previously reported findings on the neural correlates of tool use pantomime.

In the current study, 67 stroke patients with unilateral left-brain damage performed a classic pantomime task. In order to analyze different error characteristics, we evaluated the proper use of grip and movement for each pantomime. For certain objects, healthy subjects may use body parts as representative of the object, e.g., using the fingers to indicate scissors blades. To specify the pathological use of body parts as the object (BPO), we assessed only pantomime items that did not elicit this response in healthy participants. We performed modern voxel-based lesion analyses on MRI or CT data to determine associations between brain injury and the frequency of the specific types of pantomime errors.

Our results support a model in which anterior and posterior nodes of the left fronto-temporo-parietal network contribute differentially to pantomime of tool use. More precisely, damage to the inferior frontal cortex extending to the temporal pole is associated with an increased frequency of BPO errors, whereas damage to the inferior parietal lobe is predominantly linked to an increased frequency of movement and/or grip errors. Our work suggests that the validity of attempts to specify the neural correlates of limb apraxia based on tool use pantomime depends on differentiating the specific types of errors committed. We conclude that successful tool use pantomime involves dissociable functions, with communicative aspects represented in more anterior (rather ventral) regions and motor-cognitive aspects in more posterior (rather dorsal) nodes of a left fronto-temporo-parietal network.
It was concluded that all three gesture types under investigation contributed to the semantic content communicated by people with aphasia (PWA). Gestures are an important communicative means for PWA and should be regarded as such by their interlocutors. Gestures have been shown to enhance listeners' interpretation of PWA's overall communication.
Purpose: People with aphasia (PWA) spontaneously use different kinds of gestures when they communicate. Although there is evidence that the nature of the communicative task influences the linguistic performance of PWA, so far little is known about its influence on their gesture production. We aimed to investigate the influence of varying communicative constraints on the production of gesture and spoken expression by PWA in comparison to persons without language impairment.

Method: Twenty-six PWA with varying degrees of aphasia severity and 26 control participants (CP) without language impairment took part in the study. Spoken expression and gesture production were investigated in two different tasks: (a) spontaneous conversation about topics of daily living and (b) a cartoon narration task, that is, retellings of short cartoon clips. The frequencies of words and gestures, as well as of different gesture types, were analyzed and tested for potential effects of group and task.

Results: The main task effects revealed that both PWA and CP used more iconic gestures and pantomimes in the cartoon narration task than in spontaneous conversation. Metaphoric gestures, deictic gestures, number gestures, and emblems were used more frequently in spontaneous conversation than in cartoon narrations by both participant groups. Group effects showed that, in both tasks, PWA's gesture-to-word ratios were higher than those of the CP. Furthermore, PWA produced more interactive gestures than the CP in both tasks, as well as more number gestures and pantomimes in spontaneous conversation.

Conclusions: The current results suggest that PWA use gestures to compensate for their verbal limitations under varying communicative constraints. The properties of the communicative task influence the use of different gesture types in people with and without aphasia. Thus, the influence of communicative constraints needs to be considered when assessing PWA's multimodal communicative abilities.
People with aphasia use gestures not only to communicate relevant content but also to compensate for their verbal limitations. The Sketch Model (De Ruiter, 2000) assumes a flexible relationship between gesture and speech, allowing a compensatory use of the two modalities. In its successor, the AR‐Sketch Model (De Ruiter, 2017), the relationship between iconic gestures and speech is no longer assumed to be flexible and compensatory; instead, iconic gestures are assumed to express information that is redundant with speech. In this study, we evaluated the contradictory predictions of the Sketch Model and the AR‐Sketch Model using data collected from people with aphasia as well as from a group of people without language impairment. We found compensatory use of gesture only in the people with aphasia, whereas the people without language impairment made very little compensatory use of gestures. Hence, the people with aphasia gestured according to the predictions of the Sketch Model, whereas the people without language impairment did not. We conclude that aphasia fundamentally changes the relationship between gesture and speech.