Animals can use different sensory signals to localize objects in the environment. Depending on the situation, the brain either integrates information from multiple sensory sources or selects the modality conveying the most reliable information to direct behavior. This suggests that the brain has access to a modality-invariant representation of external space. Accordingly, neural structures encoding signals from more than one sensory modality are best suited for spatial information processing. In primates, the posterior parietal cortex (PPC) is a key structure for spatial representations. One substructure within human and macaque PPC is the ventral intraparietal area (VIP), known to represent visual, vestibular, and tactile signals. In the present study, we show for the first time that macaque area VIP neurons also respond to auditory stimulation. Interestingly, the strength of the responses to the acoustic stimuli greatly depended on the spatial location of the stimuli: most of the auditory-responsive neurons had surprisingly small, spatially restricted auditory receptive fields (RFs). Given this finding, we compared the auditory RF locations with the respective visual RF locations of individual area VIP neurons. In the vast majority of neurons, the auditory and visual RFs largely overlapped. Additionally, neurons with well-aligned visual and auditory receptive fields tended to encode multisensory space in a common reference frame. This suggests that area VIP constitutes part of a neuronal circuit involved in the computation of a modality-invariant representation of external space.
In monkeys, posterior parietal and premotor cortex play an important integrative role in polymodal motion processing. In contrast, our understanding of the convergence of the senses in humans is still in its early stages. To test for equivalencies between macaque and human polymodal motion processing, we used functional MRI in healthy human subjects while presenting moving visual, tactile, or auditory stimuli. Increased neural activity evoked by all three stimulus modalities was found in the depth of the intraparietal sulcus (IPS), ventral premotor, and lateral inferior postcentral cortex. The observed activations strongly suggest that polymodal motion processing in humans and monkeys is supported by equivalent areas. The activations in the depth of the IPS imply that this area constitutes the human equivalent of macaque area VIP.
Navigation in space requires the brain to combine information arising from different sensory modalities with the appropriate motor commands. Sensory information about self-motion in particular is provided by the visual and the vestibular system. The macaque ventral intraparietal area (VIP) has recently been shown to be involved in the processing of self-motion information provided by optic flow, to contain multimodal neurons, and to receive input from areas involved in the analysis of vestibular information. By studying responses to linear vestibular, visual, and bimodal stimulation, we aimed to gain deeper insight into the mechanisms of multimodal integration and self-motion processing. A large proportion of cells (77%) revealed a significant response to passive linear translation of the monkey. Of these cells, 59% encoded information about the direction of self-motion. The phase relationship between vestibular stimulation and neuronal responses covered a broad spectrum, demonstrating the complexity of the spatio-temporal pattern of vestibular information encoded by neurons in area VIP. For 53% of the direction-selective neurons the preferred directions for stimuli of both modalities were the same; they were opposite for the remaining 47% of the neurons. During bimodal stimulation the responses of neurons with opposite direction selectivity in the two modalities were determined either by the visual (53%) or the vestibular (47%) modality. These heterogeneous responses to unimodal and bimodal stimulation might be used to prevent misjudgements about self- and/or object-motion, which could be caused by relying on information from one sensory modality alone.
The pictorial content of visual memories recalled by association is embodied by neuronal activity at the highest processing stages of primate visual cortex. This activity is elicited by top-down signals from the frontal lobe and recapitulates the bottom-up pattern normally obtained by the recalled stimulus. To explore the generality and mechanisms of this phenomenon, we recorded motion-sensitive neurons at an early stage of cortical processing. After monkeys learned to associate directions of motion with static shapes, these neurons exhibited unprecedented selectivity for the shapes. This emergent shape selectivity reflects activation of neurons representing the motion stimuli recalled by association, and it suggests that recall-related activity may be a general feature of neurons in visual cortex.
In the posterior parietal cortex (PPC) of the macaque, spatial and motion signals arising from different sensory signals converge. One of the functional subregions within the PPC, the ventral intraparietal area (VIP), is thought to play an important role for the multisensory encoding of self- and object motion. In the present study we analysed the activity of area VIP neurons related to smooth pursuit eye movements (SPEMs). Fifty-three per cent of the neurons (123/234) were selective for the direction of the SPEMs. As evident from control experiments, activity observed during smooth eye movements was more closely related to extraretinal signals than visual parameters. In addition, we examined the sensitivity of area VIP neurons for the velocity of SPEMs. Seventy-four per cent of the pursuit-related neurons had a significant velocity tuning. There was a clear preference for high velocities. Eighty-six per cent of the neurons preferred the highest pursuit velocity (40 deg s⁻¹) employed in our study. In everyday life, high pursuit velocities most frequently occur if the pursuit target is located in near-extrapersonal space, i.e. the action space of the head. Together with previous findings, the current results thus suggest that the information provided by VIP neurons may be used to encode motion in near-extrapersonal space and to guide and co-ordinate smooth eye and head movements within this very part of space.
Visual motion processing plays a key role in enabling primates' successful interaction with their dynamic environments. Although in natural environments the speed of visual stimuli continuously varies, speed tuning of neurons in the prototypical motion area MT has traditionally been assessed with stimuli that moved at constant speeds. We investigated whether the representation of speed in a continuously varying stimulus context differs from the representation of constant speeds. We recorded from individual MT neurons of fixating macaques while stimuli moved either at a constant speed or in a linearly accelerating or decelerating manner. We found clear speed tuning even when the stimulus consisted of visual motion with gradual speed changes. There were, however, important differences with the speed tuning as measured with constant stimuli: the stimulus context affected neuronal preferred speed as well as the associated tuning width of the speed tuning curves. These acceleration-dependent changes in response lead to an accurate representation of the acceleration of these stimuli in the MT cells. To elucidate the mechanistic basis of this signal, we constructed a stochastic firing rate model based on the constant speed response profiles. This model incorporated each cell's speed tuning and response adaptation dynamics and accurately predicted the response to constant speeds as well as accelerating and decelerating stimuli. Because the response of the model neurons had no explicit acceleration dependence, we conclude that speed-dependent adaptation creates a strong influence of temporal context on the MT response and thereby results in the representation of acceleration signals.
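The adaptation mechanism described above can be illustrated with a toy model. The snippet below is a minimal sketch, not the fitted model from the study: it combines a log-Gaussian speed tuning curve (a common descriptive form for MT neurons) with divisive adaptation driven by the cell's own recent output. All names and parameter values (`pref_speed`, `sigma`, `tau`, `strength`) are illustrative assumptions.

```python
import numpy as np

def speed_tuning(speed, pref_speed=8.0, sigma=1.2):
    """Log-Gaussian speed tuning curve (illustrative parameters)."""
    s0 = 0.3  # small offset keeps the log defined near zero speed
    return np.exp(-0.5 * (np.log((speed + s0) / (pref_speed + s0)) / sigma) ** 2)

def adapted_response(speeds, dt=0.01, tau=0.2, strength=0.7):
    """Firing rate with divisive adaptation: the drive comes from the
    tuning curve; an adaptation variable integrates recent output and
    divisively suppresses the response."""
    a = 0.0
    rates = []
    for s in speeds:
        drive = speed_tuning(s)
        r = drive / (1.0 + strength * a)
        a += dt * (r - a) / tau  # low-pass filter of recent output
        rates.append(r)
    return np.array(rates)

t = np.arange(0.0, 1.0, 0.01)
accel = 2.0 + 28.0 * t   # accelerating stimulus: 2 -> 30 deg/s
decel = accel[::-1]      # decelerating stimulus: 30 -> 2 deg/s
r_acc = adapted_response(accel)
r_dec = adapted_response(decel)
# The same physical speed evokes different rates on the two trajectories
# because the adaptation state differs, so a downstream readout can in
# principle recover the sign of the speed change without any explicit
# acceleration dependence in the model neuron.
```

This mirrors the paper's conclusion in miniature: the model neuron has no acceleration term, yet its responses to accelerating and decelerating stimuli differ systematically through speed-dependent adaptation.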
Kaminiarz A, Schlack A, Hoffmann KP, Lappe M, Bremmer F. Visual selectivity for heading in the macaque ventral intraparietal area. J Neurophysiol 112: 2470-2480, 2014. First published August 13, 2014; doi:10.1152/jn.00410.2014.

The patterns of optic flow seen during self-motion can be used to determine the direction of one's own heading. Tracking eye movements, which typically occur during everyday life, complicate this task, since they add further retinal image motion and (predictably) distort the retinal flow pattern. Humans employ both visual and nonvisual (extraretinal) information to solve a heading task in such cases. Likewise, it has been shown that neurons in the monkey medial superior temporal area (area MST) use both signals during the processing of self-motion information. In this article we report that neurons in the macaque ventral intraparietal area (area VIP) use visual information derived from the distorted flow patterns to encode heading during (simulated) eye movements. We recorded responses of VIP neurons to simple radial flow fields and to distorted flow fields that simulated self-motion plus eye movements. In 59% of the cases, cell responses compensated for the distortion and kept the same heading selectivity irrespective of different simulated eye movements. In addition, response modulations during real compared with simulated eye movements were smaller, consistent with reafferent signaling involved in the processing of the visual consequences of eye movements in area VIP. We conclude that the motion selectivities found in area VIP, like those in area MST, provide a way to successfully analyze and use flow fields during self-motion and simultaneous tracking movements.

Keywords: self-motion; primate; parietal cortex; eye movements

Self-motion through an environment induces visual, vestibular, tactile, and auditory signals.
Neurophysiological research over the last two decades has shown in the animal model, i.e., the macaque monkey, how these signals interact to enhance and disambiguate the perception of heading during self-motion. Two areas of the primate extrastriate and parietal cortex proved to be of specific importance in this context, i.e., the medial superior temporal area (area MST) and the ventral intraparietal area (area VIP). Neurons in area MST respond to visual and vestibular self-motion signals, and their causal role in heading perception has been confirmed (Bremmer et al.
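To first order, the distorted flow fields discussed above can be written as the sum of a radial expansion centered on the heading and a near-uniform rotational component added by the pursuit eye movement. The sketch below illustrates this decomposition under that first-order assumption (it ignores scene depth structure and higher-order rotation terms); the function name and parameter values are our own, not from the study.

```python
import numpy as np

def retinal_flow(points, heading=(0.0, 0.0), pursuit=(3.0, 0.0)):
    """First-order retinal flow for forward translation plus smooth pursuit.
    points: (N, 2) image coordinates (deg); heading: focus of expansion of
    the translational component; pursuit: eye velocity (deg/s)."""
    p = np.asarray(points, dtype=float)
    radial = p - np.asarray(heading)   # expansion away from the heading
    rotational = -np.asarray(pursuit)  # pursuit adds a near-uniform field
    return radial + rotational         # (N, 2) flow vectors (deg/s)

# During pursuit the singularity (zero-flow point) of the combined field is
# displaced from the true heading; recovering heading then requires either
# extraretinal signals or the purely visual compensation reported for VIP.
grid = np.array([[x, y] for x in range(-10, 11, 5) for y in range(-10, 11, 5)])
flow = retinal_flow(grid, heading=(0.0, 0.0), pursuit=(3.0, 0.0))
singularity = grid[np.argmin(np.linalg.norm(flow, axis=1))]
```

In this toy field the point of minimal flow shifts from the true heading at (0, 0) toward the pursuit direction, which is exactly the distortion that compensating VIP neurons must discount.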