Despite reports of improved auditory discrimination capabilities in blind humans and visually deprived animals, there is no general agreement as to the nature or pervasiveness of such compensatory sensory enhancements. Neuroimaging studies have pointed out differences in cerebral organization between blind and sighted humans, but the relationship between these altered cortical activation patterns and auditory sensory acuity remains unclear. Here we compare behavioural and electrophysiological indices of spatial tuning within central and peripheral auditory space in congenitally blind and normally sighted but blindfolded adults to test the hypothesis (raised by earlier studies of the effects of auditory deprivation on visual processing) that the effects of visual deprivation might be more pronounced for processing peripheral sounds. We find that blind participants displayed localization abilities that were superior to those of sighted controls, but only when attending to sounds in peripheral auditory space. Electrophysiological recordings obtained at the same time revealed sharper tuning of early spatial attention mechanisms in the blind subjects. Differences in the scalp distribution of brain electrical activity between the two groups suggest a compensatory reorganization of brain areas in the blind that may contribute to the improved spatial resolution for peripheral sound sources.
Researchers have known for more than a century that crossing the hands can impair both tactile perception and the execution of appropriate finger movements. Sighted people find it more difficult to judge the temporal order of two tactile stimuli, one applied to each hand, when their hands are crossed over the midline than when they adopt a more typical uncrossed-hands posture. It has been argued that, because of the dominant role of vision in motor planning and execution, tactile stimuli are remapped into externally defined coordinates (predominantly determined by visual inputs); this remapping takes longer when external and body-centered codes (determined primarily by somatosensory/proprioceptive inputs) are in conflict, and it involves both multisensory parietal and visual cortex. Here, we show that the performance of late, but not congenitally, blind people was impaired by crossing the hands. Moreover, we provide the first empirical evidence for superior temporal order judgments (TOJs) for tactile stimuli in the congenitally blind. These findings suggest a critical role of childhood vision in modulating the perception of touch, one that may arise from the emergence of specific crossmodal links during development.
Neurophysiological recordings and neuroimaging data in blind and deaf animals and humans suggest that perceptual functions may be organized differently after sensory deprivation. It has been argued that neural plasticity contributes to compensatory performance in blind humans, such as faster speech processing. The present study employed functional magnetic resonance imaging (fMRI) to map language-related brain activity in congenitally blind adults. Participants listened to sentences, with either an easy or a more difficult syntactic structure, which were either semantically meaningful or meaningless. Results show that blind adults not only activate classical left-hemispheric perisylvian language areas during speech comprehension, as did a group of sighted adults, but that they additionally display an activation in the homologous right-hemispheric structures and in extrastriate and striate cortex. Both the perisylvian and occipital activity varied as a function of syntactic difficulty and semantic content. The results demonstrate that the cerebral organization of complex cognitive systems such as the language system is significantly shaped by the input available.
The present study investigated with event-related potentials whether attending to a moment in time modulates the processing of auditory stimuli at a similar early, perceptual level as attending to a location in space. The participants listened to short (600 ms) and long (1,200 ms) intervals marked by white noise bursts. The task was to attend in alternating runs either to the short or to the long intervals and to respond to rare offset markers that differed in intensity from the frequent standard offset markers. Prior to the to-be-attended moment, a slow negative potential developed over the frontal scalp. Stimuli presented at the attended compared to the unattended moments in time elicited an enhanced N1 and an enhanced posteriorly distributed positivity (300-370 ms). The results show that attention can be flexibly controlled in time and that not only late but also early perceptual processing stages are modulated by attending to a moment in time.
Arithmetic facts are stored in densely interconnected memory networks, and retrieval errors may occur because activation spreads to associated results. We studied the extent of activation spread by means of the so-called N400 effect of the event-related brain potential (ERP). With semantic stimuli, N400 amplitude has proved to be inversely proportional to the amount of activation that originates from a priming context. ERPs were recorded from 61 scalp positions while 16 subjects verified 600 multiplication problems (a × b = c). The solution to each problem could be correct or incorrect. Incorrect solutions were either table related to one of the operands (e.g., 5 × 8 = 32, 24, or 16) or unrelated (e.g., 5 × 8 = 34, 26, or 18), and lay at a small, medium, or large numerical distance from the correct product. Our findings suggest that activation spread in an arithmetic memory network is restricted to numbers that are table related to one of the operands and that are numerically plausible.

Results of simple multiplication problems are retrieved from declarative memory as well-established facts. Retrieval models (Campbell & Graham, 1985; Stazyk, Ashcraft, & Hamann, 1982) assume that each operand and the problem as a whole trigger an automatic spread of activation within a densely interconnected memory network. The total activation creates a candidate set of answers, and the most strongly activated answer node triggers the response. This basic idea can explain several findings obtained with tasks involving retrieval of arithmetic facts. For example, in a verification task, participants have to decide on the correctness of a multiplication problem. A basic finding is that decision time is shorter and error rate smaller for unrelated errors (3 × 8 = 34) than for related errors (3 × 8 = 32; i.e., errors that are multiples of either the first or the second operand). This can be attributed to the fact that solutions associated with one operand have a higher competitive activation level than nonassociated solutions.

Another significant factor that affects decision time in arithmetic verification tasks is the numerical distance between a correct and an incorrect solution (3 × 8 = 32 vs. 3 × 8 = 48). Reaction time (RT) to incorrect equations decreases with increasing numerical distance from the correct solution (Ashcraft & Stazyk, 1981). Up to now, it has not been clear whether this effect of numerical distance has the same functional basis as the relatedness effect. It could be that the relatedness effect is due to activation spread and the distance effect is due to a process that estimates the plausibility of the size of a solution (see, e.g., Zbrodoff & Logan, 1986). It is difficult to tease such factors apart by means of RT alone, because RT represents the accumulation of the effects of all delaying factors on a task.

Studies with semantic materials have revealed that the strength of activation in memory networks can be measured by using event-related brain potentials (ERPs; Kutas & Hillyard, 1980; Kutas & Van Petten, ...
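To make the stimulus categories described above concrete, the following minimal sketch (illustrative only; the function and variable names are our own and not taken from the study) classifies a candidate incorrect solution for a problem a × b as table related (a multiple of either operand within the single-digit tables) or unrelated, and reports its numerical distance from the correct product.

```python
def classify_lure(a, b, proposed):
    """Classify an incorrect solution to a x b as in the verification paradigm
    described above (illustrative sketch, not the authors' code)."""
    correct = a * b
    if proposed == correct:
        return "correct", 0
    # Table related: a multiple of either operand within the single-digit tables.
    table = {a * n for n in range(1, 10)} | {b * n for n in range(1, 10)}
    relatedness = "table-related" if proposed in table else "unrelated"
    distance = abs(proposed - correct)
    return relatedness, distance

# Examples from the abstract (correct product of 5 x 8 is 40):
print(classify_lure(5, 8, 32))  # ('table-related', 8)  -> 32 = 8 x 4
print(classify_lure(5, 8, 34))  # ('unrelated', 6)
```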
We report a series of event-related potential experiments designed to dissociate the functionally distinct processes involved in the comprehension of highly restricted lexical-semantic relations (antonyms). We sought to differentiate between influences of semantic relatedness (which are independent of the experimental setting) and processes related to predictability (which differ as a function of the experimental environment). To this end, we conducted three ERP studies contrasting the processing of antonym relations (black-white) with that of related (black-yellow) and unrelated (black-nice) word pairs. Whereas the lexical-semantic manipulation was kept constant across experiments, the experimental environment and the task demands varied: Experiment 1 presented the word pairs in a sentence context of the form The opposite of X is Y and used a sensicality judgment. Experiment 2 used a word pair presentation mode and a lexical decision task. Experiment 3 also examined word pairs, but with an antonymy judgment task. All three experiments revealed a graded N400 response (unrelated > related > antonyms), thus supporting the assumption that semantic associations are processed automatically. In addition, the experiments revealed that, in highly constrained task environments, the N400 gradation occurs simultaneously with a P300 effect for the antonym condition, thus leading to the superficial impression of an extremely "reduced" N400 for antonym pairs. Comparisons across experiments and participant groups revealed that the P300 effect is not only a function of stimulus constraints (i.e., sentence context) and experimental task, but that it is also crucially influenced by individual processing strategies used to achieve successful task performance.
Animal studies have shown that visual deprivation during the first months of life permanently impairs the interactions between sensory systems. Here we report an analogous effect for humans who had been deprived of pattern vision for at least the first five months of their life as a result of congenital binocular cataracts. These patients showed reduced audio-visual interactions in later life, although their visual performance in control tasks was unimpaired. Thus, adequate (multisensory) input during the first months of life seems to be a prerequisite in humans, as well as in animals, for the full development of cross-modal interactions.