Affective and social processes play a major role in everyday life, but appropriate methods to assess disturbances in these processes after brain lesions are still lacking. Past studies have shown that amygdala damage can impair recognition of facial expressions, particularly fear, as well as processing of gaze direction, but the mechanisms responsible for these deficits remain debated. Recent accounts of human amygdala function suggest that it is a critical structure involved in self-relevance appraisal. According to such accounts, responses to a given facial expression may vary depending on concomitant gaze direction and perceived social meaning. Here we investigated facial emotion recognition and its interaction with gaze in patients with unilateral amygdala damage (n = 19), compared to healthy controls (n = 10), using computer-generated dynamic face stimuli expressing variable intensities of fear, anger, or joy, with different gaze directions (direct versus averted). If emotion perception is influenced by the self-relevance of an expression based on gaze direction, a fearful face with averted gaze should be more relevant than the same expression with direct gaze, because it signals danger near the observer; whereas anger with direct gaze should be more relevant than with averted gaze, because it directly threatens the observer. Our results confirm a critical role for the amygdala in self-relevance appraisal: we found an interaction between gaze and emotion in healthy controls, a trend toward such an interaction in left-damaged patients, and no interaction in right-damaged patients. Impaired expression recognition was generally more severe for fear, with a greater deficit after right than left damage. These findings not only provide new insights into human amygdala function, but may also help design novel neuropsychological tests sensitive to amygdala dysfunction in various patient populations.
We tested whether human amygdala lesions impair vocal processing in intact cortical networks. In two functional MRI experiments, patients with unilateral amygdala resection either listened to voices and nonvocal sounds or heard binaural vocalizations with attention directed toward or away from emotional information on one side. In experiment 1, all patients showed reduced activation to voices in the ipsilesional auditory cortex. In experiment 2, emotional voices evoked increased activity in both the auditory cortex and the intact amygdala for right-damaged patients, whereas no such effects were found for left-damaged amygdala patients. Furthermore, the left inferior frontal cortex was functionally connected with the intact amygdala in right-damaged patients, but only with homologous right frontal areas and not with the amygdala in left-damaged patients. Thus, unilateral amygdala damage leads to globally reduced ipsilesional cortical voice processing, but only left amygdala lesions are sufficient to suppress the enhanced auditory cortical processing of vocal emotions.
We tested 125 normal subjects and 24 right and 22 left focal brain-damaged patients (RBD and LBD) on the Rey figure copying test and on a battery of perceptual and representational visuospatial tasks, in search of relationships between constructional and visuospatial abilities. The selected RBD and LBD patients were not affected by severe aphasia, unilateral spatial neglect, or general intellectual deficits. Both RBD and LBD patients performed worse than normal subjects on the constructional task. On the visuospatial tasks, both patient groups scored lower than normal subjects in judging angle width and mentally assembling geometrical figures; moreover, RBD patients, but not LBD patients, scored significantly lower than healthy controls in judging line orientation and analyzing geometrical figures. Post-hoc comparisons did not reveal any significant differences between RBD and LBD patients. Multiple regression analysis showed that visuospatial abilities correlate with accuracy in copying geometrical drawings in normal subjects and in RBD patients, but not in LBD patients. From a theoretical perspective, these findings support the idea that visual perceptual and representational abilities play a role in constructional skills.
In the context of emotion information processing, several studies have demonstrated the involvement of the amygdala in emotion perception, for unimodal and multimodal stimuli. However, it seems that not only the amygdala, but several regions around it, may also play a major role in multimodal emotional integration. In order to investigate the contribution of these regions to multimodal emotion perception, five patients who had undergone unilateral anterior temporal lobe resection were exposed to both unimodal (vocal or visual) and audiovisual emotional and neutral stimuli. In a classic paradigm, participants were asked to rate the emotional intensity of angry, fearful, joyful, and neutral stimuli on visual analog scales. Compared with matched controls, patients exhibited impaired categorization of joyful expressions, whether the stimuli were auditory, visual, or audiovisual. Patients confused joyful faces with neutral faces, and joyful prosody with surprise. In the case of fear, unlike matched controls, patients provided lower intensity ratings for visual stimuli than for vocal and audiovisual ones. Fearful faces were frequently confused with surprised ones. When we controlled for lesion size, we no longer observed any overall difference between patients and controls in their ratings of emotional intensity on the target scales. Lesion size had the greatest effect on intensity perceptions and accuracy in the visual modality, irrespective of the type of emotion. These new findings suggest that a damaged amygdala, or a disrupted bundle between the amygdala and the ventral part of the occipital lobe, has a greater impact on emotion perception in the visual modality than it does in either the vocal or audiovisual one. We can surmise that patients are able to use the auditory information contained in multimodal stimuli to compensate for difficulty processing visually conveyed emotion.
The Battery for Visuospatial Abilities (BVA, known in Italy as TeRaDiC) was developed to analyse putative basic skills involved in drawing and to plan and monitor outcomes after rehabilitation of visuoconstructional disorders. It encompasses eight tasks assessing both simple "perceptual" abilities, such as line length and line orientation judgments, and complex "representational" abilities, such as mental rotation. The aim of the present study was to provide normative values for the BVA collected in a wide sample of healthy Italian subjects. Three hundred and seventeen healthy Italian subjects (173 women and 144 men) of different age classes (age range, 40-95 years) and education levels (from primary school to university), all with a normal score on the Mini Mental State Examination, completed the BVA/TeRaDiC. Multiple linear regression analysis revealed that age and education significantly influenced performance on most tests of the BVA/TeRaDiC; only line length judgment was not affected by education level. Gender significantly affected line orientation judgment and mental rotation, with an advantage for males on both tests. From the derived linear equations, a correction grid for adjusting BVA/TeRaDiC raw scores was built. Using a non-parametric technique, inferential cut-off scores were determined and equivalent scores computed. The present study thus provides Italian normative data for the BVA/TeRaDiC, useful for both clinical and research purposes.
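The regression-based score adjustment described above can be sketched in a few lines. The following is a minimal illustration on simulated data, not the published BVA/TeRaDiC norms: all coefficients, score ranges, and subject data are hypothetical. A linear model of test score on age and education is fitted by least squares, and each raw score is corrected by subtracting the demographic effect predicted for that subject, re-centring on the sample mean.

```python
# Illustrative sketch of regression-based normative score adjustment.
# All data below are simulated; this is NOT the published BVA/TeRaDiC norm.
import numpy as np

rng = np.random.default_rng(0)
n = 300
age = rng.uniform(40, 95, n)                 # years
education = rng.uniform(5, 17, n)            # years of schooling
# simulated raw scores: decline with age, improve with education, plus noise
raw = 30 - 0.1 * age + 0.5 * education + rng.normal(0, 2, n)

# fit score ~ b0 + b1*age + b2*education by ordinary least squares
X = np.column_stack([np.ones(n), age, education])
b, *_ = np.linalg.lstsq(X, raw, rcond=None)

# adjusted score: remove each subject's predicted demographic effect,
# then re-centre on the sample mean (residual + grand mean)
predicted = X @ b
adjusted = raw - (predicted - raw.mean())
```

By construction the adjusted scores are uncorrelated with age and education within the normative sample, which is what makes a single cut-off applicable across demographic groups; a published battery would additionally tabulate the corrections as a grid over age and education bands.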