Blind individuals have to rely on nonvisual information to a greater extent than sighted individuals in order to interact efficiently with the environment, and they consequently exhibit superior skills in their spared modalities. These performance advantages are often paralleled by responses in the occipital cortex, which have been suggested to be essential for nonvisual processing in the blind. However, it is currently unclear through which pathways (i.e., thalamocortical or corticocortical connections) nonvisual information reaches the occipital cortex of the blind. Here, we used functional magnetic resonance imaging to study blind and matched sighted humans with an auditory discrimination paradigm and used dynamic causal modeling (DCM) to investigate the effective connectivity underlying auditory activations in the primary visual cortex of blind volunteers. Model comparison revealed that a model connecting the medial geniculate nucleus (MGN), primary auditory cortex (A1), and primary visual cortex (V1) in a bidirectional manner outperformed all other models in both groups. Regarding inference on model parameters, we observed that basic auditory mechanisms (i.e., sensory input to MGN and connections from MGN to A1) did not differ significantly between the two groups. In contrast, we found clear evidence for stronger corticocortical connections from A1 to V1 in the blind, whereas results with regard to thalamocortical enhancement (from MGN to V1 and, in a control analysis, from the lateral geniculate nucleus to V1) were not consistent. These results suggest that plastic changes especially in corticocortical connectivity allow auditory information to evoke responses in the primary visual cortex of blind individuals.
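To make the model comparison concrete, here is a minimal sketch of how such a DCM model space can be encoded and compared. The region set (MGN, A1, V1) and the bidirectional winner follow the abstract; the two alternative models, the matrix encoding, and all log-evidence values are hypothetical placeholders standing in for the output of an actual DCM inversion (e.g., in SPM).

```python
import numpy as np

# Regions in the model space (fixed order): MGN, A1, V1.
regions = ["MGN", "A1", "V1"]

# Candidate intrinsic-connectivity ("A") matrices: rows = target,
# columns = source; 1 marks a modeled connection. The winning model
# connects all three regions bidirectionally; the alternatives below
# are illustrative, not the paper's exact model space.
models = {
    "bidirectional_full": np.array([[0, 1, 1],
                                    [1, 0, 1],
                                    [1, 1, 0]]),
    "thalamocortical_only": np.array([[0, 0, 0],   # MGN -> A1, MGN -> V1
                                      [1, 0, 0],
                                      [1, 0, 0]]),
    "corticocortical_only": np.array([[0, 0, 0],   # MGN -> A1, A1 -> V1
                                      [1, 0, 0],
                                      [0, 1, 0]]),
}

# Hypothetical log model evidences (e.g., free-energy approximations
# summed over subjects); real values would come from model inversion.
log_evidence = {"bidirectional_full": -340.2,
                "thalamocortical_only": -356.9,
                "corticocortical_only": -351.4}

best = max(log_evidence, key=log_evidence.get)
for name, le in log_evidence.items():
    # A log-Bayes factor of about 3 or more is conventionally "strong" evidence.
    print(f"{name}: log evidence = {le:.1f}, "
          f"log-Bayes factor vs. best = {le - log_evidence[best]:.1f}")
print("winning model:", best)
```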
The speeding-up of neural processing associated with attended events (i.e., the prior-entry effect) has long been proposed as a viable mechanism by which attention can prioritize our perception and action. In the brain, this has been thought to be regulated through a sensory gating mechanism that increases the amplitudes of early evoked potentials while leaving their latencies unaffected. However, the majority of previous research has emphasized speeded responding rather than fine temporal discrimination, and may therefore have lacked the sensitivity to reveal putative modulations in the timing of neural processing. In the present study, we used a cross-modal temporal order judgment task while shifting attention between the visual and tactile modalities to investigate the mechanisms underlying selective attention electrophysiologically. Our results indicate that attention can indeed speed up neural processes during visual perception, thereby providing the first electrophysiological support for the existence of prior entry.
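As a worked illustration of how prior entry is typically quantified behaviorally, the sketch below fits a cumulative Gaussian to temporal order judgments and reads off the point of subjective simultaneity (PSS) under each attention condition. All response proportions are invented for illustration, and the setup mirrors the visual-tactile paradigm only loosely.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Stimulus onset asynchronies (ms); negative = tactile led, positive = visual led.
soa = np.array([-200, -100, -50, -25, 0, 25, 50, 100, 200])

# Hypothetical proportions of "visual first" responses when attention is
# directed to vision vs. touch (illustrative values only).
p_attend_vision = np.array([0.03, 0.10, 0.25, 0.42, 0.58, 0.74, 0.86, 0.95, 0.99])
p_attend_touch  = np.array([0.02, 0.06, 0.14, 0.26, 0.40, 0.57, 0.73, 0.90, 0.98])

def psychometric(x, pss, sigma):
    """Cumulative Gaussian: P('visual first') as a function of SOA."""
    return norm.cdf(x, loc=pss, scale=sigma)

for label, p in [("attend vision", p_attend_vision),
                 ("attend touch", p_attend_touch)]:
    (pss, sigma), _ = curve_fit(psychometric, soa, p, p0=[0.0, 50.0])
    print(f"{label}: PSS = {pss:.1f} ms, slope (sigma) = {sigma:.1f} ms")

# A more negative PSS under "attend vision" means the tactile stimulus needs
# a head start to be perceived as simultaneous -- i.e., attended visual
# processing is sped up, the behavioral signature of prior entry.
```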
Serotonin is implicated in many aspects of behavioral regulation. Theoretical attempts to unify the multiple roles assigned to serotonin proposed that it regulates the impact of costs, such as delay or punishment, on action selection. Here, we show that serotonin also regulates other types of action cost, such as effort. We compared behavioral performance in 58 healthy humans treated for 8 weeks with either placebo or the selective serotonin reuptake inhibitor escitalopram. The task involved trading handgrip force production against monetary benefits. Participants in the escitalopram group produced more effort and thereby achieved a higher payoff. Crucially, our computational analysis showed that this effect was underpinned by a specific reduction of effort cost, and not by any change in the weight of monetary incentives. This specific computational effect sheds new light on the physiological role of serotonin in behavioral regulation and on the clinical effect of drugs for depression.

Clinical trial registration: ISRCTN75872983. DOI: http://dx.doi.org/10.7554/eLife.17282.001
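A minimal sketch of the kind of effort-cost computation at stake: assuming a quadratic cost term k·F² weighed against a linear monetary benefit (a common simplification, not necessarily the paper's exact model, with all parameter values hypothetical), lowering the cost weight k, as escitalopram is argued to do, raises both the optimal force and the resulting payoff.

```python
def optimal_force(reward, k, f_max=1.0):
    """Return the force level (fraction of maximal grip) maximizing
    net utility U(F) = reward * F - k * F**2 (quadratic effort cost)."""
    f_star = reward / (2 * k)            # analytic maximum of U(F)
    return min(max(f_star, 0.0), f_max)  # clip to the feasible range

reward = 1.0           # monetary incentive, arbitrary units
k_placebo = 1.5        # hypothetical effort-cost weight, placebo group
k_escitalopram = 1.0   # hypothetically reduced cost weight under the SSRI

for label, k in [("placebo", k_placebo), ("escitalopram", k_escitalopram)]:
    f = optimal_force(reward, k)
    print(f"{label}: optimal force = {f:.2f}, monetary payoff = {reward * f:.2f}")
```

Under these illustrative numbers, the lower cost weight yields more force (0.50 vs. 0.33 of maximal grip) and a proportionally higher payoff, matching the direction of the reported group difference.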
Emotional signals are of pivotal relevance in social interactions. Neuroimaging and lesion studies have established an important role of the amygdala for the processing of these signals. While the human amygdala receives input from all sensory modalities, it is the visual modality that is most important for the emotional aspects of social interactions. Consequently, amygdala involvement in visual emotional processing has been unequivocally established, whereas its role in auditory emotional processing is less clear. To investigate amygdala involvement in auditory emotional processing, we used functional magnetic resonance imaging in sighted and congenitally blind volunteers, the latter of whom lack visual experience during development but have outstanding capabilities to process auditory signals, which are their dominant source of information in social interactions. First, we observed a performance advantage of the congenitally blind in auditory discrimination tasks that was paralleled by occipital cortex activation, which was not present in the sighted. More importantly, the blind not only showed robust selective activation in the amygdala to fearful and angry compared to neutral voices but also showed stronger activation to those stimuli than sighted participants. Higher amygdala activity for fearful items was further associated with individual performance in the blind, indicating that amygdala activation in the blind is not only driven by blindness per se but also by inter-individual differences in auditory capabilities. Our results indicate that the responsivity of the amygdala to emotional signals develops even in the absence of visual emotional experience and serves the sensory modality that is the most reliable source of emotional information.
Functional magnetic resonance imaging (fMRI) studies have provided ample evidence for the involvement of the lateral occipital cortex (LO), fusiform gyrus (FG), and intraparietal sulcus (IPS) in visuo-haptic object integration. Here we applied 30 min of sham (ineffective) or real offline 1 Hz repetitive transcranial magnetic stimulation (rTMS) to perturb neural processing in left LO immediately before subjects performed a visuo-haptic delayed-match-to-sample task during fMRI. In this task, subjects had to match sample (S1) and target (S2) objects presented sequentially within or across vision and/or haptics in both directions (visual-haptic or haptic-visual) and decide whether or not S1 and S2 were the same object. Real rTMS transiently decreased activity at the site of stimulation and in remote regions such as the right LO and bilateral FG during haptic S1 processing. Without affecting behavior, the same stimulation gave rise to relative increases in activation during S2 processing in the right LO, left FG, bilateral IPS, and other regions previously associated with object recognition. Critically, the modality of S2 determined which regions were recruited after rTMS. Relative to sham rTMS, real rTMS induced increased activations during crossmodal congruent matching in the left FG for haptic S2 and the temporal pole for visual S2. In addition, we found stronger activations for incongruent than congruent matching in the right anterior parahippocampus and middle frontal gyrus for crossmodal matching of haptic S2 and in the left FG and bilateral IPS for unimodal matching of visual S2, only after real but not sham rTMS. The results imply that a focal perturbation of the left LO triggers modality-specific interactions between the stimulated left LO and other key regions of object processing, possibly to maintain unimpaired object recognition. This suggests that visual and haptic processing engage partially distinct brain networks during visuo-haptic object matching.
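For readers reconstructing the design, the sketch below enumerates the factorial structure implied by the abstract (rTMS condition × S1 modality × S2 modality × congruency); the condition labels are assumptions, not the authors' exact terminology.

```python
from itertools import product

# Assumed condition labels reconstructing the factorial design:
# rTMS (sham/real) x S1 modality x S2 modality x S1-S2 congruency.
stimulation = ["sham", "real"]          # offline 1 Hz rTMS over left LO
modalities = ["visual", "haptic"]
congruency = ["congruent", "incongruent"]

for tms, s1, s2, match in product(stimulation, modalities, modalities, congruency):
    # Same modality for S1 and S2 = unimodal matching; otherwise crossmodal.
    transfer = "unimodal" if s1 == s2 else "crossmodal"
    print(f"{tms} rTMS | S1 {s1} -> S2 {s2} | {transfer}, {match}")
```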