Sensorimotor co-ordination in mammals is achieved predominantly via the activity of the basal ganglia. To investigate the underlying multisensory information processing, we recorded the neuronal responses in the caudate nucleus (CN) and substantia nigra (SN) of anaesthetized cats to visual, auditory or somatosensory stimulation alone and also to their combinations, i.e. multisensory stimuli. The main goal of the study was to ascertain whether multisensory information provides more information to the neurons than do the individual sensory components. A majority of the investigated SN and CN multisensory units exhibited significant cross-modal interactions. The multisensory response enhancements were either additive or superadditive; multisensory response depressions were also detected. CN and SN cells with facilitatory and inhibitory interactions were found in each multisensory combination. The strengths of the multisensory interactions did not differ in the two structures. A significant inverse correlation was found between the strengths of the best unimodal responses and the magnitudes of the multisensory response enhancements, i.e. the neurons with the weakest net unimodal responses exhibited the strongest enhancement effects. The onset latencies of the responses of the integrative CN and SN neurons to the multisensory stimuli were significantly shorter than those to the unimodal stimuli. These results provide evidence that the multisensory CN and SN neurons, similarly to those in the superior colliculus and related structures, have the ability to integrate multisensory information. Multisensory integration may help in the effective processing of sensory events and the changes in the environment during motor actions controlled by the basal ganglia.
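The abstract reports additive and superadditive enhancements, response depressions, and an inverse correlation between unimodal response strength and enhancement magnitude, but gives no formula. As a hedged illustration, the sketch below applies the conventional multisensory enhancement index from the superior colliculus literature (percent change of the combined response relative to the best unimodal response); the function names and all spike counts are invented for demonstration, not taken from the study.

```python
# Hedged sketch, not from the paper: the conventional multisensory
# enhancement index (percent change of the combined response relative
# to the best unimodal response). All spike counts are invented.

def enhancement_index(multi, best_uni):
    """Percent change of the multisensory response relative to the
    best unimodal response."""
    return 100.0 * (multi - best_uni) / best_uni

def classify_interaction(multi, uni_responses):
    """Label the cross-modal interaction of one hypothetical neuron.

    multi         -- mean response to the combined stimulus
    uni_responses -- mean responses to each modality presented alone
    """
    best = max(uni_responses)
    additive_sum = sum(uni_responses)
    if multi < best:
        return "depression"
    if multi > additive_sum:
        return "superadditive enhancement"
    return "additive or sub-additive enhancement"

# Invented neuron: visual 8 spikes/s, auditory 5 spikes/s, audiovisual 20.
print(enhancement_index(20, 8))          # → 150.0
print(classify_interaction(20, [8, 5]))  # → superadditive enhancement
```

Under an index of this form, the inverse correlation the abstract describes (often called inverse effectiveness) corresponds to neurons with small `best_uni` values tending to show the largest percentage enhancements.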
Visual single-unit activity was recorded in the caudate nucleus of halothane-anaesthetized, immobilized, artificially ventilated cats. Visually sensitive neurons were found in the dorsolateral part of the caudate body. A majority of the units responded optimally to small, spot-like stimuli moving at velocities between 30 and 120 deg/s. The receptive fields of these units were large, covering a major part of both the contra- and ipsilateral visual hemifields. No signs of retinotopy were observed. Most of the neurons displayed directional selectivity and were narrowly tuned to the direction of the moving stimulus. These physiological properties are consistent with recent morphological results revealing multiple connections of the caudate nucleus with the superior colliculus through tecto-extrageniculo-thalamic pathways in the mammalian brain.
The spatio-temporal frequency response profiles of 73 neurons located in the superficial, retino-recipient layers of the feline superior colliculus (SC) were investigated. The majority of the SC cells responded optimally to very low spatial frequencies, with a mean of 0.1 cycles/degree (c/deg). The spatial resolution was also low, with a mean of 0.31 c/deg. The spatial frequency tuning functions were either low-pass or band-pass, with a mean spatial frequency bandwidth of 1.84 octaves. The cells responded optimally to a range of temporal frequencies between 0.74 cycles/s (c/s) and 26.41 c/s, with a mean of 6.84 c/s. The majority (68%) of the SC cells showed band-pass temporal frequency tuning with a mean temporal frequency bandwidth of 2.4 octaves, while smaller proportions of the SC units displayed high-pass (19%), low-pass (8%) or broad-band (5%) temporal tuning. Most of the SC units exhibited simple spectral tuning with a single maximum in the spatio-temporal frequency domain, while some neurons were tuned independently to spatial or temporal frequency, or were tuned to speed. Further, we found cells excited by gratings moving at high temporal and low spatial frequencies, and cells whose activity was suppressed by high-velocity movement. The spatio-temporal filter properties of the SC neurons show close similarities to those of their retinal Y and W inputs, as well as to those of their inputs from the cortical visual motion detector areas, suggesting a common role in motion analysis and related behavioral actions.
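The tuning bandwidths above are reported in octaves. As a hedged illustration of how such a value is typically obtained from a measured curve, the sketch below interpolates the half-height crossings of an invented band-pass spatial frequency tuning curve and expresses their ratio in octaves; the data points and function names are hypothetical, not from the study.

```python
# Hedged sketch, not from the paper: bandwidth in octaves of a
# band-pass tuning curve, measured at half-height. The sample
# tuning curve below is invented for demonstration.
import math

def octave_bandwidth(freqs, responses):
    """Full width at half-maximum of a band-pass tuning curve,
    expressed in octaves: log2(f_high / f_low).

    freqs     -- stimulus frequencies in ascending order (e.g. c/deg)
    responses -- mean response at each frequency
    Returns None when there are not two half-height crossings
    (low-pass or high-pass curves).
    """
    half = max(responses) / 2.0
    crossings = []
    for (f0, r0), (f1, r1) in zip(zip(freqs, responses),
                                  zip(freqs[1:], responses[1:])):
        if (r0 - half) * (r1 - half) < 0:  # curve crosses half-height here
            # log-linear interpolation of the crossing frequency
            t = (half - r0) / (r1 - r0)
            crossings.append(f0 * (f1 / f0) ** t)
    if len(crossings) < 2:
        return None
    return math.log2(crossings[-1] / crossings[0])

# Invented cell peaking near 0.1 c/deg:
freqs = [0.025, 0.05, 0.1, 0.2, 0.4]
rates = [4.0, 12.0, 20.0, 9.0, 3.0]
print(round(octave_bandwidth(freqs, rates), 2))  # → 2.16
```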
Associative learning is a basic cognitive function by which discrete and often different percepts are linked together. The Rutgers Acquired Equivalence Test investigates a specific kind of associative learning, visually guided equivalence learning. The test consists of an acquisition (pair learning) phase and a test (rule transfer) phase, which are associated primarily with the function of the basal ganglia and the hippocampi, respectively. Earlier studies described that both brain structures fundamentally involved in visual associative learning, the basal ganglia and the hippocampi, receive not only visual but also multisensory information. However, no study has investigated whether there is a priority for multisensory-guided equivalence learning over unimodal learning. Thus, no data were available on the modality dependence or independence of equivalence learning. In the present study, we therefore introduced auditory- and multisensory (audiovisual)-guided equivalence learning paradigms and investigated the performance of 151 healthy volunteers in the visual as well as in the auditory and multisensory paradigms. Our results indicated that visual, auditory and multisensory guided associative learning is similarly effective in healthy humans, which suggests that the acquisition phase is fairly independent of the modality of the stimuli. On the other hand, in the test phase, where participants were presented with associations learned earlier as well as associations that had not yet been seen or heard but were predictable, the multisensory stimuli elicited the best performance. The test phase, especially its generalization part, seems to be a harder cognitive task, in which multisensory information processing could improve the performance of the participants.
The spatial and temporal visual sensitivity to drifting sinusoidal gratings was studied in 75 neurons of the feline anterior ectosylvian visual area (AEV). Extracellular single-unit recordings were performed in halothane-anesthetized (0.6%), immobilized, artificially ventilated cats. Most cells were strongly sensitive to the direction of drifting gratings. The mean value of the direction tuning widths was approximately 90 deg. Most of the cells (69 of the 75 cases) displayed rather narrowly tuned band-pass characteristics in the low spatial frequency range, with a mean optimal spatial frequency of 0.2 cycles/degree (c/deg). The mean spatial bandwidth was 1.4 octaves. The remainder of the units was low-pass tuned. A majority of the units responded optimally to high temporal frequencies (mean 6.3 Hz), although some cells did exhibit preferences for every examined temporal frequency between 0.6 Hz and 10.8 Hz. The temporal frequency-tuning functions mostly revealed a band-pass character with a mean temporal bandwidth of 1.1 octaves. Our results demonstrate that the neurons along the anterior ectosylvian sulcus display particular spatial and temporal characteristics. The AEV neurons, with their preference for low spatial frequencies and with their fine spatial and temporal tuning properties, seem to be candidates for special tasks in motion perception.
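The direction tuning widths above average roughly 90 deg. As a hedged illustration of what such a width corresponds to, the sketch below computes the full width at half-maximum of a von Mises direction tuning profile, a common model for circular tuning curves; the profile and its concentration parameter are assumptions for demonstration, not fits from the study.

```python
# Hedged sketch, not from the paper: direction tuning width as the
# full width at half-maximum (FWHM) of an assumed von Mises profile
# r(theta) = exp(kappa * (cos(theta) - 1)), peak-normalized.
import math

def fwhm_degrees(kappa):
    """FWHM, in degrees, of the von Mises tuning curve above.

    Solves exp(kappa * (cos(t) - 1)) = 1/2 for the half-height
    angle t, then doubles it.
    """
    half_angle = math.acos(1.0 + math.log(0.5) / kappa)
    return 2.0 * math.degrees(half_angle)

# A concentration of ~2.37 yields a width near the ~90 deg mean reported.
print(round(fwhm_degrees(2.37), 1))  # → 89.9
```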
This study describes a possible mechanism of coding of multisensory information in the anterior ectosylvian visual area of the feline cortex. Extracellular microelectrode recordings of 168 cells were carried out in the anterior ectosylvian sulcal region of halothane-anaesthetized, immobilized, artificially ventilated cats. Ninety-five neurons were found to respond to visual stimuli, 96 responded to auditory stimuli and 45 were bimodal, reacting to both visual and auditory modalities. A large proportion of the neurons exhibited significantly different responses to stimuli appearing in different regions of their huge receptive fields. These neurons have the ability to provide information, via their discharge rate, about the site of the stimulus within their receptive field. This suggests that they may serve as panoramic localizers. The ability of the bimodal neurons to localize bimodal stimulus sources is better than either of the unimodal localizing functions. Further, the sites of maximal responsivity of the visual, auditory and bimodal neurons are distributed over the whole extent of the large receptive fields. Thus, a large population of such panoramic visual, auditory and multisensory neurons could accurately code the locations of the sensory stimuli. Our findings support the notion that there is a distributed population code of multisensory information in the feline associative cortex.
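The population-code idea above can be made concrete with a small sketch: if each "panoramic localizer" has a broad but graded rate profile across azimuth, a downstream reader can recover stimulus location by matching the observed population response against the predicted pattern at each candidate azimuth. The tuning shapes, preferred azimuths, and the template-matching decoder below are all invented for illustration; the study itself does not specify a decoding scheme.

```python
# Hedged sketch, not from the paper: decoding stimulus azimuth from a
# population of broadly tuned "panoramic localizer" neurons by
# least-squares template matching. All tuning parameters are invented.
import math

AZIMUTHS = list(range(-90, 91, 15))  # candidate azimuths, degrees

def tuning(pref, azim, width=60.0, peak=20.0):
    """Assumed broad Gaussian rate profile across the hemifield."""
    return peak * math.exp(-((azim - pref) / width) ** 2)

PREFS = [-80, -30, 10, 50, 85]  # preferred azimuths of five model cells

def decode(rates):
    """Return the azimuth whose predicted population response pattern
    best matches the observed rates."""
    def err(azim):
        return sum((tuning(p, azim) - r) ** 2 for p, r in zip(PREFS, rates))
    return min(AZIMUTHS, key=err)

# Stimulus at +45 deg: generate noiseless responses, then decode them.
observed = [tuning(p, 45) for p in PREFS]
print(decode(observed))  # → 45
```

With noiseless responses the decoder recovers the true azimuth exactly; the point of the sketch is that no single broadly tuned cell pins down the location, while the joint pattern across the population does.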