To elucidate how gaze informs the construction of mental space during wayfinding in visual species such as primates, we jointly examined navigation behavior, visual exploration, and hippocampal activity as macaque monkeys searched a virtual reality maze for a reward. Cells sensitive to place also responded to one or more variables such as head direction, point of gaze, or task context. Many cells fired at the sight (and in anticipation) of a single landmark in a viewpoint- or task-dependent manner, simultaneously encoding the animal's logical situation within a set of actions leading to the goal. Overall, hippocampal activity was best fit by a fine-grained state space comprising current position, view, and action contexts. Our findings indicate that the primate counterparts of rodent place cells embody multidimensional, task-situated knowledge pertaining to the target of gaze, thereby supporting self-awareness in the construction of space.
In the jungle, survival is highly correlated with the ability to detect and distinguish between an approaching predator and a putative prey. From an ecological perspective, a predator rapidly approaching its prey is a stronger cue for flight than a slowly moving predator. In the present study, we use functional magnetic resonance imaging in the nonhuman primate to investigate the neural bases of the prediction of an impact to the body by a looming stimulus, i.e., the interaction between a dynamic visual stimulus approaching the body and its expected consequences on an independent sensory modality, namely, touch. We identify a core cortical network of occipital, parietal, premotor, and prefrontal areas maximally activated by tactile stimulations presented at the predicted time and location of impact of the looming stimulus on the face, compared with the activations observed for spatially or temporally incongruent tactile and dynamic visual cues. These activations reflect an active integration both of visual and tactile information and of spatial and temporal prediction information. The identified cortical network coincides with a well-described multisensory visuotactile convergence and integration network suggested to play a key role in the definition of peripersonal space. These observations are discussed in the context of multisensory integration, spatial and temporal prediction, and Bayesian causal inference.

Looming stimuli have a particular ecological relevance, as they are expected to come into contact with the body, evoking touch or pain sensations and possibly triggering approach or escape behavior depending on their identity. Here, we identify the nonhuman primate functional network that is maximally activated by tactile stimulations presented at the predicted time and location of impact of the looming stimulus.
Our findings suggest that the integration of spatial and temporal predictive cues may rely on the same neural mechanisms that are involved in multisensory integration.