During face-to-face communication, body orientation and coverbal gestures influence how information is conveyed. The neural pathways underpinning the comprehension of such nonverbal social cues in everyday interaction remain partly unknown. During fMRI data acquisition, 37 participants were presented with video clips showing an actor speaking short sentences. The actor produced speech-associated iconic gestures (IC) or no gestures (NG) while he was visible either from an egocentric (ego) or from an allocentric (allo) position. Participants were asked to indicate via button press whether they felt addressed or not. We found a significant interaction of body orientation and gesture in addressment evaluations, indicating that participants evaluated IC-ego conditions as the most addressing. The anterior cingulate cortex (ACC) and left fusiform gyrus were more strongly activated for the egocentric versus the allocentric actor position in the gesture context. The activation increase in the ACC for IC-ego > IC-allo further correlated positively with increased addressment ratings in the egocentric gesture condition. Gesture-related activation increases in the supplementary motor area, left inferior frontal gyrus, and right insula correlated positively with the gesture-related increase of addressment evaluations in the egocentric context. These results indicate that gesture use and body orientation contribute to the feeling of being addressed and together influence neural processing in brain regions involved in motor simulation, empathy, and mentalizing.
Abstractness and modality of interpersonal communication have a considerable impact on comprehension. They are relevant for determining thoughts and constituting internal models of the environment. Whereas concrete, object-related information can be represented in mind irrespective of language, abstract concepts require a representation in speech. Consequently, modality-independent processing of abstract information can be expected. Here we investigated the neural correlates of abstractness (abstract vs. concrete) and modality (speech vs. gestures) to identify an abstractness-specific supramodal neural network. During fMRI data acquisition, 20 participants were presented with videos of an actor either speaking sentences with an abstract-social [AS] or concrete-object-related content [CS], or performing meaningful abstract-social emblematic [AG] or concrete-object-related tool-use gestures [CG]. Gestures were accompanied by a foreign language to increase the comparability between conditions and to frame the communicative context of the gesture videos. Participants performed a content judgment task referring to the person- vs. object-relatedness of the utterances. The behavioral data suggest a comparable comprehension of contents communicated by speech or gesture. Furthermore, we found common neural processing for abstract information independent of modality (AS > CS ∩ AG > CG) in a left-hemispheric network including the left inferior frontal gyrus (IFG), temporal pole, and medial frontal cortex. Modality-specific activations were found in bilateral occipital, parietal, and temporal as well as right inferior frontal brain regions for gesture (G > S), and in left anterior temporal regions and the left angular gyrus for the processing of speech semantics (S > G). These data support the idea that abstract concepts are represented in a supramodal manner.
Consequently, gestures referring to abstract concepts are processed in a predominantly left hemispheric language related neural network.
The semantic integration between gesture and speech (GSI) is mediated by the left posterior superior temporal sulcus/middle temporal gyrus (pSTS/MTG) and the left inferior frontal gyrus (IFG). Evidence from electroencephalography (EEG) suggests that oscillations in the alpha and beta bands may support processes at different stages of GSI. In the present study, we investigated the relationship between electrophysiological oscillations and blood-oxygen-level-dependent (BOLD) activity during GSI. In a simultaneous EEG-fMRI study, German participants (n = 19) were presented with videos of an actor either performing meaningful gestures in the context of a comprehensible German (GG) or an incomprehensible Russian sentence (GR), or just speaking a German sentence (SG). EEG results revealed reduced alpha and beta power for the GG vs. SG conditions, while fMRI analyses showed a BOLD increase in the left pSTS/MTG for GG > GR ∩ GG > SG. In time-window-based EEG-informed fMRI analyses, we further found a positive correlation between single-trial alpha power and the BOLD signal in the left pSTS/MTG, the left IFG, and several subcortical regions. Moreover, the alpha-pSTS/MTG correlation was observed in an earlier time window than the alpha-IFG correlation, thus supporting a two-stage processing model of GSI. Our study suggests multiple roles of alpha oscillations during GSI and shows that EEG-informed fMRI is a strong candidate method for multidimensional investigations of complex cognitive functions such as GSI.