Animals can use different sensory signals to localize objects in the environment. Depending on the situation, the brain either integrates information from multiple sensory sources or selects the modality conveying the most reliable information to direct behavior. This suggests that the brain has access to a modality-invariant representation of external space. Accordingly, neural structures encoding signals from more than one sensory modality are best suited for spatial information processing. In primates, the posterior parietal cortex (PPC) is a key structure for spatial representations. One substructure within human and macaque PPC is the ventral intraparietal area (VIP), known to represent visual, vestibular, and tactile signals. In the present study, we show for the first time that macaque area VIP neurons also respond to auditory stimulation. Interestingly, the strength of the responses to the acoustic stimuli depended greatly on the spatial location of the stimuli; i.e., most of the auditory-responsive neurons had surprisingly small, spatially restricted auditory receptive fields (RFs). Given this finding, we compared the auditory RF locations with the respective visual RF locations of individual area VIP neurons. In the vast majority of neurons, the auditory and visual RFs largely overlapped. Additionally, neurons with well-aligned visual and auditory receptive fields tended to encode multisensory space in a common reference frame. This suggests that area VIP constitutes part of a neuronal circuit involved in the computation of a modality-invariant representation of external space.
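The abstract does not state how the overlap between auditory and visual RFs was quantified. As a minimal illustrative sketch only, the snippet below shows one plausible way to compare two RF maps sampled on a common grid of stimulus locations; the function name, the threshold criterion, and the toy Gaussian maps are all assumptions for illustration, not the authors' method.

```python
import numpy as np

def receptive_field_overlap(resp_a, resp_b, threshold=0.5):
    """Fraction of grid locations inside both RFs, relative to the smaller RF.

    resp_a, resp_b: 2-D arrays of mean responses on the same grid of
    stimulus locations (one map per modality for a single neuron).
    threshold: RF defined as locations exceeding this fraction of the
    map's peak response (an arbitrary, illustrative criterion).
    """
    rf_a = resp_a >= threshold * resp_a.max()
    rf_b = resp_b >= threshold * resp_b.max()
    joint = np.logical_and(rf_a, rf_b).sum()
    smaller = min(rf_a.sum(), rf_b.sum())
    return joint / smaller if smaller else 0.0

# Toy example: two Gaussian-shaped maps with nearby centres (degrees).
x, y = np.meshgrid(np.linspace(-40, 40, 41), np.linspace(-30, 30, 31))
visual = np.exp(-((x - 5) ** 2 + (y - 0) ** 2) / (2 * 10 ** 2))
auditory = np.exp(-((x - 10) ** 2 + (y - 5) ** 2) / (2 * 15 ** 2))
print(f"overlap index: {receptive_field_overlap(visual, auditory):.2f}")
```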
The primary aim of this study was to determine the extent to which human MT+/V5, an extrastriate visual area known to mediate motion processing, is involved in visuomotor coordination. To pursue this, we increased or decreased the excitability of MT+/V5, primary motor, and primary visual cortex by applying 7 min of anodal or cathodal transcranial direct current stimulation (tDCS) in healthy human subjects while they performed a visuomotor tracking task involving hand movements. The percentage of correct tracking movements increased specifically during and immediately after cathodal stimulation, which decreases cortical excitability, and only when V5 was stimulated. None of the other stimulation conditions affected visuomotor performance. We propose that the improvement in performance caused by cathodal tDCS of V5 is due to a focusing effect on the complex motion perception demands of this task. This hypothesis was supported by additional experiments: testing simple and complex motion perception with dot kinematograms, we found that the reduction in excitability induced by cathodal stimulation improved the subjects' perception of the direction of coherent motion only when it was presented among randomly moving dots (complex motion perception), and worsened it when only one motion direction was presented (simple motion perception). Our data suggest that area V5 is critically involved in complex motion perception and identification processes important for visuomotor coordination. The results also raise the possibility that tDCS may be useful in rehabilitation strategies for neurological patients with visuomotor disorders.
Performance of visuo-motor tasks requires the transformation of visual information into motor output and depends strongly on visual perception and cognitive processing, particularly during the learning phase. The primary aim of this study was to determine whether the human middle temporal area (MT+/V5), an extrastriate visual area known to mediate motion processing, and the primary motor cortex are involved in the learning of visuo-motor coordination tasks. To pursue this, we increased or decreased the excitability of MT+/V5, the contralateral primary motor cortex (M1), and the primary visual cortex by 10 min of anodal or cathodal transcranial direct current stimulation in healthy human subjects during the learning phase of a visually guided tracking task. The percentage of correct tracking movements increased significantly in the early learning phase during anodal stimulation, but only when the left V5 or M1 was stimulated. Cathodal stimulation had no significant effect, and stimulation of the primary visual cortex was not effective for this kind of task. Our data suggest that areas V5 and M1 are involved in the early phase of learning visuo-motor coordination.
In monkeys, the posterior parietal and premotor cortices play an important integrative role in polymodal motion processing. In contrast, our understanding of the convergence of the senses in humans is still in its infancy. To test for equivalences between macaque and human polymodal motion processing, we used functional MRI in healthy subjects while presenting moving visual, tactile, or auditory stimuli. Increased neural activity evoked by all three stimulus modalities was found in the depth of the intraparietal sulcus (IPS), ventral premotor cortex, and lateral inferior postcentral cortex. These activations strongly suggest that polymodal motion processing in humans and monkeys is supported by equivalent areas. The activations in the depth of the IPS imply that this region constitutes the human equivalent of macaque area VIP.
We studied the effect of eye position on visual and pursuit-related activity of neurons in the superior temporal sulcus of the macaque monkey. Altogether, 109 neurons from the middle temporal area (area MT) and the medial superior temporal area (area MST) were tested for an influence of eye position on their stimulus-driven response in a fixation paradigm. In this paradigm the monitored eye-position signal was superimposed onto the stimulus control signal while the monkey fixated different locations on a screen. This setup guaranteed that an optimized stimulus was moved across the receptive field at the same retinal location for all fixation locations. For 61% of the MT neurons and 82% of the MST neurons the stimulus-induced response was modulated by the position of the eyes in the orbit. Directional selectivity was not influenced by eye position. One hundred sixty-eight neurons exhibited direction-specific responses during smooth tracking eye movements and were tested in a pursuit paradigm. Here the monkey had to track a target that started to move in the preferred direction with constant speed from five different locations on the screen in random order. Pursuit-related activity was modulated by eye position in 78% of the MT neurons and 80% of the MST neurons tested. Neuronal activity varied linearly as a function of both horizontal and vertical eye position for most of the neurons tested in both areas; i.e., two-dimensional regression planes could be fitted to the responses of most of the neurons. The directions of the gradients of these regression planes correlated neither with the preferred stimulus direction tested in the fixation paradigm nor with the preferred tracking direction in the pursuit paradigm. Eighty-six neurons were tested with both the fixation and the pursuit paradigms. The directions of the gradients of the regression planes fitted to the responses in the two paradigms tended to correlate with each other; i.e., for more than two thirds of the neurons the angular difference between the two directions was less than ±90°. The modulatory effect of the position of the eyes in the orbit balanced out at the population level for neurons in areas MT and MST, in both the fixation and the pursuit paradigm. The results are discussed in light of the hypothesis of an ongoing coordinate transformation of incoming sensory signals into a nonretinocentric representation of the visual field.
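The abstract describes fitting two-dimensional regression planes to firing rate as a function of horizontal and vertical eye position (a classic gain-field analysis) but does not give the fitting procedure. The sketch below is a minimal illustration using ordinary least squares on invented example data; the function name, the five fixation positions, and the noise model are assumptions for illustration only.

```python
import numpy as np

def fit_eye_position_plane(eye_x, eye_y, rate):
    """Fit rate ~ b0 + b1*eye_x + b2*eye_y by ordinary least squares.

    Returns the plane coefficients and the direction (deg) of the
    plane's gradient in eye-position space.
    """
    X = np.column_stack([np.ones_like(eye_x), eye_x, eye_y])
    coef, *_ = np.linalg.lstsq(X, rate, rcond=None)
    gradient_dir = np.degrees(np.arctan2(coef[2], coef[1]))
    return coef, gradient_dir

# Hypothetical data: responses at five fixation positions (deg), with a
# linear eye-position modulation plus noise.
eye_x = np.array([-20.0, -10.0, 0.0, 10.0, 20.0])
eye_y = np.array([10.0, -10.0, 0.0, 10.0, -10.0])
rng = np.random.default_rng(0)
rate = 30 + 0.4 * eye_x + 0.2 * eye_y + rng.normal(0, 1, eye_x.size)

coef, direction = fit_eye_position_plane(eye_x, eye_y, rate)
print(f"plane: {coef[0]:.1f} + {coef[1]:.2f}*x + {coef[2]:.2f}*y, "
      f"gradient direction: {direction:.1f} deg")
```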
How does the brain process visual information about self-motion? In monkey cortex, the analysis of visual motion is performed by successive areas specialized in different aspects of motion processing. Whereas neurons in the middle temporal (MT) area are direction-selective for local motion, neurons in the medial superior temporal (MST) area respond to motion patterns. A neural network model attempts to link these properties to the psychophysics of human heading detection from optic flow. It proposes that populations of neurons represent specific directions of heading. We quantitatively compared single-unit recordings in area MST with single-neuron simulations in this model. Predictions were derived from simulations and subsequently tested in recorded neurons. Neuronal activities depended on the position of the singular point in the optic flow. Best responses to opposing motions occurred for opposite locations of the singular point in the visual field. Excitation by one type of motion is paired with inhibition by the opposite motion. Activity maxima often occur for peripheral singular points. The averaged recorded shape of the response modulations is sigmoidal, which is in agreement with model predictions. We also tested whether the activity of the neuronal population in MST can represent the directions of heading in our stimuli. A simple least-mean-square minimization could retrieve the direction of heading from the neuronal activities with a precision of 4.3 degrees. Our results show good agreement between the proposed model and the neuronal responses in area MST and further support the hypothesis that area MST is involved in visual navigation.
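The population read-out described above (a least-mean-square retrieval of heading direction from MST activities) is not spelled out in the abstract. The following sketch shows one common variant of such a decoder, least-squares template matching, applied to synthetic cosine-tuned "neurons"; the tuning model, population size, and noise level are assumptions for illustration, not the recorded data.

```python
import numpy as np

def decode_heading_lsq(population_response, tuning_curves, headings):
    """Least-mean-square decoding of heading from population activity.

    tuning_curves: (n_neurons, n_headings) expected responses.
    population_response: (n_neurons,) observed activity on one trial.
    Returns the candidate heading whose expected population pattern has
    the smallest summed squared difference from the observation.
    """
    errors = ((tuning_curves - population_response[:, None]) ** 2).sum(axis=0)
    return headings[np.argmin(errors)]

# Synthetic population: cosine tuning to heading azimuth (illustrative only).
rng = np.random.default_rng(1)
headings = np.arange(0, 360, 1.0)                 # candidate headings (deg)
preferred = rng.uniform(0, 360, size=64)          # preferred headings (deg)
tuning = 10 + 8 * np.cos(np.radians(headings[None, :] - preferred[:, None]))

true_heading = 137.0
response = (10 + 8 * np.cos(np.radians(true_heading - preferred))
            + rng.normal(0, 1, preferred.size))
estimate = decode_heading_lsq(response, tuning, headings)
print(f"true {true_heading:.1f} deg, decoded {estimate:.1f} deg")
```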
Navigation in space requires the brain to combine information arising from different sensory modalities with the appropriate motor commands. Sensory information about self-motion in particular is provided by the visual and the vestibular systems. The macaque ventral intraparietal area (VIP) has recently been shown to be involved in the processing of self-motion information provided by optic flow, to contain multimodal neurons, and to receive input from areas involved in the analysis of vestibular information. By studying responses to linear vestibular, visual, and bimodal stimulation, we aimed to gain more insight into the mechanisms involved in multimodal integration and self-motion processing. A large proportion of cells (77%) showed a significant response to passive linear translation of the monkey. Of these cells, 59% encoded information about the direction of self-motion. The phase relationship between vestibular stimulation and neuronal responses covered a broad spectrum, demonstrating the complexity of the spatio-temporal pattern of vestibular information encoded by neurons in area VIP. For 53% of the direction-selective neurons the preferred directions for stimuli of the two modalities were the same; they were opposite for the remaining 47% of the neurons. During bimodal stimulation the responses of neurons with opposite direction selectivity in the two modalities were determined either by the visual (53%) or the vestibular (47%) modality. These heterogeneous responses to unimodal and bimodal stimulation might help prevent misjudgements about self- and/or object-motion that could result from relying on information from one sensory modality alone.
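The abstract reports the phase relationship between (presumably sinusoidal) vestibular stimulation and the neuronal response without describing how the phase was computed. A common approach is to project the response onto sine and cosine at the stimulation frequency; the sketch below illustrates that idea on a synthetic rate signal. The function name, the 0.5 Hz stimulus, and the example phase are assumptions for illustration only.

```python
import numpy as np

def response_phase_and_gain(spike_rate, time, stim_freq):
    """Phase (deg) and amplitude of the response component at stim_freq.

    spike_rate: instantaneous firing rate sampled at the times in `time`
    (ideally covering an integer number of stimulus cycles). Phase is
    expressed relative to a sine-phase stimulus, sin(2*pi*stim_freq*t).
    """
    omega = 2 * np.pi * stim_freq
    # Project the rate onto sine and cosine at the stimulation frequency.
    s = np.mean(spike_rate * np.sin(omega * time)) * 2
    c = np.mean(spike_rate * np.cos(omega * time)) * 2
    amplitude = np.hypot(s, c)
    phase = np.degrees(np.arctan2(c, s))  # 0 deg = in phase with the stimulus
    return phase, amplitude

# Hypothetical example: 0.5 Hz translation, response leading by 40 deg.
t = np.arange(0, 20, 0.01)
rate = 25 + 6 * np.sin(2 * np.pi * 0.5 * t + np.radians(40))
phase, amp = response_phase_and_gain(rate, t, 0.5)
print(f"response phase: {phase:.1f} deg, modulation amplitude: {amp:.1f} spikes/s")
```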