Functional magnetic resonance imaging was used to assess the cortical areas active during the observation of mouth actions performed by humans and by individuals belonging to other species (monkey and dog). Two types of actions were presented: biting and oral communicative actions (speech reading, lip-smacking, barking). As a control, static images of the same actions were shown. Observation of biting, regardless of the species of the individual performing the action, produced two activation foci (one rostral and one caudal) in the inferior parietal lobule and an activation of the pars opercularis of the inferior frontal gyrus and the adjacent ventral premotor cortex. The left rostral parietal focus (possibly BA 40) and the left premotor focus were very similar in all three conditions, while the right-sided foci were stronger during the observation of actions made by conspecifics. The observation of speech reading activated the left pars opercularis of the inferior frontal gyrus, the observation of lip-smacking activated a small focus in the pars opercularis bilaterally, and the observation of barking did not produce any activation in the frontal lobe. Observation of all types of mouth actions induced activation of extrastriate occipital areas. These results suggest that actions made by other individuals may be recognized through different mechanisms. Actions belonging to the motor repertoire of the observer (e.g., biting and speech reading) are mapped on the observer's motor system. Actions that do not belong to this repertoire (e.g., barking) are essentially recognized based on their visual properties. We propose that when the motor representation of the observed action is activated, the observer gains knowledge of the observed action in a "personal" perspective, while this perspective is lacking when there is no motor activation.
Early-onset right-sided mesial temporal lobe epilepsy is the key substrate determining a severe deficit in recognizing emotional facial expressions, especially fear.
Neurons involved in grasp preparation with hand and mouth were previously recorded in the premotor cortex of the monkey. The aim of the present kinematic study was to determine whether a single planning process underlies grasping with the hand and the mouth in humans as well. In a set of four experiments, healthy subjects reached for and grasped an object of varying size with the hand while opening the mouth (experiments 1 and 3), extending the other forearm (experiment 4), or extending the fingers of the other hand (experiment 5). In a subsequent set of three experiments, subjects grasped an object of varying size with the mouth while opening the fingers of the right hand (experiments 6-8). The initial kinematics of mouth and finger opening, but not of forearm extension, was affected by the size of the grasped object, congruently with the size effect on initial grasp kinematics. This effect was due neither to visual presentation of the object without the successive grasp motor act (experiment 2), nor to synchronism between finger and mouth opening (experiments 3, 7, and 8). In experiment 9, subjects grasped an object of varying size with the right hand while pronouncing a syllable printed on the target. Mouth opening and sound production were affected by the size of the grasped object. The results of the present study are discussed according to the notion that in an action each motor act is prepared before the beginning of the motor sequence. Double grasp preparation can be used for successive motor acts on the same object, as when grasping food with the hand and ingesting it after bringing it to the mouth. We speculate that the circuits involved in double grasp preparation might have been the neural substrate where hand motor patterns used as primitive communication signs were transferred to the mouth articulation system. This is in accordance with the hypothesis that Broca's area derives phylogenetically from the monkey premotor area where hand movements are controlled.
Humor is a uniquely human ability. Suls [A two-stage model for the appreciation of jokes and cartoons. In J. H. Goldstein & P. E. McGhee (Eds.), The psychology of humor: Theoretical perspectives and empirical issues. New York: Academic Press, 1972, pp. 81-100] proposed a two-stage model of humor: detection and resolution of incongruity. Incongruity is generated when a prediction is not confirmed in the final part of a story. To comprehend humor, it is necessary to revisit the story, transforming an incongruous situation into a funny, congruous one. Patient and neuroimaging studies carried out to date have led to divergent outcomes. In particular, patient studies found that right brain-lesion patients have difficulties in humor comprehension, whereas neuroimaging studies suggested a major involvement of the left hemisphere in both humor detection and comprehension. To prevent activation of the left hemisphere due to language processing, we devised a nonverbal task comprising cartoon pairs. Our findings demonstrate activation of both the left and the right hemispheres when comparing funny versus nonfunny cartoons. In particular, we found activation of the right inferior frontal gyrus (BA 47), the left superior temporal gyrus (BA 38), the left middle temporal gyrus (BA 21), and the left cerebellum. These areas were also activated in a nonverbal task exploring attribution of intention [Brunet, E., Sarfati, Y., Hardy-Bayle, M. C., & Decety, J. A PET investigation of the attribution of intentions with a nonverbal task. Neuroimage, 11, 157-166, 2000]. We hypothesize that the resolution of incongruity might occur through a process of intention attribution. We also asked subjects to rate the funniness of each cartoon pair. A parametric analysis showed that the left amygdala was activated in relation to subjective amusement. We hypothesize that the amygdala plays a key role in giving humor an emotional dimension.
Looking at still images of body parts in situations that are likely to cause pain has been shown to be associated with activation in some brain areas involved in pain processing. Because pain involves both sensory components and negative affect, it is of interest to explore whether the visually evoked representations of pain and of other negative emotions overlap. By means of event-related functional magnetic resonance imaging, here we compare the brain areas recruited, in female volunteers, by the observation of painful, disgusting, or neutral stimuli delivered to one hand or foot. Several cortical foci were activated by the observation of both painful and disgusting video clips, including portions of the medial prefrontal cortex, anterior, mid-, and posterior cingulate cortex, left posterior insula, and right parietal operculum. Signal changes in perigenual cingulate and left anterior insula were linearly related to the perceived unpleasantness, when the individual differences in susceptibility to aversive stimuli were taken into account. Painful scenes selectively induced activation of left parietal foci, including the parietal operculum, the postcentral gyrus, and adjacent portions of the posterior parietal cortex. In contrast, brain foci specific for disgusting scenes were found in the posterior cingulate cortex. These data show both similarities and differences between the brain patterns of activity related to the observation of noxious or disgusting stimuli. Namely, the parietal cortex appears to be particularly involved in the recognition of noxious environmental stimuli, suggesting that areas involved in sensory aspects of pain are specifically triggered by observing noxious events.
Face emotion processing is impaired in patients with Parkinson's disease (PD), with a disproportionate deficit involving fear and sadness. The pattern of facial expression processing impairment in PD patients might depend on the regional distribution of the pathology. The widespread involvement of both emotional and propositional prosodic processing parallels the aprosodic characteristics of Parkinsonian speech production.
Crohn's disease is associated with brain morphological changes in cortical and subcortical structures involved in nociception, emotional, and cognitive processes. Our findings provide new insight into the brain involvement in chronic inflammatory bowel disorders.
Purpose: To evaluate facial emotion recognition (FER) in a cohort of 176 patients with chronic temporal lobe epilepsy (TLE). Methods: FER was tested by matching facial expressions with the verbal labels for the following basic emotions: happiness, sadness, fear, disgust, and anger. Emotion recognition performances were analyzed in medial (n = 140) and lateral (n = 36) TLE groups. Fifty healthy subjects served as controls. The clinical and neuroradiologic variables potentially affecting the ability to recognize facial expressions were taken into account. Results: The medial TLE (MTLE) group showed impaired FER (86% correct recognition) compared to both the lateral TLE patients (FER = 93.5%) and the controls (FER = 96.4%), with 42% of MTLE patients recording rates of FER that were lower [by at least 2 standard deviations (SDs)] than the control mean. The MTLE group was impaired compared to the healthy controls in the recognition of all basic facial expressions except happiness. The patients with bilateral MTLE were the most severely impaired, followed by the right and then the left MTLE patients. FER was not affected by type of lesion, number of antiepileptic drugs (AEDs), aura semiology, or gender. Conversely, the early onset of seizures/epilepsy was related to FER deficits. These deficits were already established in young adulthood, with no evidence of progression in older MTLE patients. Conclusion: These results on a large cohort of TLE patients demonstrate that emotion recognition deficits are common in MTLE patients and widespread across negative emotions. We confirm that early onset seizures with right or bilateral medial temporal dysfunction lead to severe deficits in recognizing facial expressions of emotions.