Perceptual phenomena that occur around the time of a saccade, such as peri-saccadic mislocalization or saccadic suppression of displacement, have often been linked to mechanisms of spatial stability. These phenomena are usually regarded as errors in trans-saccadic spatial transformations, and they provide important tools for studying these processes. However, a true understanding of the brain processes that prepare for a saccade and transfer information across it requires a closer, more quantitative approach, one that links different perceptual phenomena with each other and with the functional requirement of ensuring spatial stability. We review a number of computational models of peri-saccadic spatial perception that provide steps in that direction. Although most models address only specific phenomena, comparing them yields some generalization and interconnection. Our analysis shows how different perceptual effects can be brought together coherently and linked back to neuronal mechanisms on the way to explaining vision across saccades.
Spatial perception, the localization of stimuli in space, can rely on visual reference stimuli or on egocentric factors such as a stimulus position relative to eye gaze. In total darkness, only an egocentric reference frame provides sufficient information. When stimuli are briefly flashed around saccades, the localization error reveals potential mechanisms of updating such reference frames, as described in several theories and computational models. Recent experimental evidence, however, has shown that the maximum amount of mislocalization does not scale linearly with saccade amplitude but rather stays below 13° even for long saccades, which differs from the predictions of present models. We propose a new model of peri-saccadic mislocalization in complete darkness to account for this observation. According to this model, mislocalization arises not on the motor side, by comparing a retinal position signal with an extraretinal eye-position-related signal, but through updating of stimulus position in visual areas by a combination of proprioceptive eye position and corollary discharge. Simulations with realistic input signals and temporal dynamics show that both signals together are used for spatial updating and in turn bring about peri-saccadic mislocalization.
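The mechanism summarized above can be sketched numerically. The following toy simulation is not the published model; all time constants, signal shapes, and weights are illustrative assumptions. It shows how combining a sluggish, anticipatory corollary-discharge signal with a delayed proprioceptive eye-position signal yields an internal eye-position estimate that transiently deviates from the true eye position, producing a localization error for flashed stimuli that vanishes well before and after the saccade.

```python
import numpy as np

def sigmoid(t, onset, dur):
    # smooth saccade-like transition from 0 to 1, centered within [onset, onset + dur]
    return 1.0 / (1.0 + np.exp(-(t - onset - dur / 2) / (dur / 8)))

t = np.arange(-200, 400, 1.0)            # time in ms; saccade onset at t = 0
amp = 20.0                               # saccade amplitude in degrees
eye = amp * sigmoid(t, 0.0, 50.0)        # actual eye position (fast, ~50 ms saccade)

# Hypothetical internal signals (parameters invented for illustration, not fitted):
cd = amp * sigmoid(t, -50.0, 150.0)      # corollary discharge: anticipatory but sluggish
prop = amp * sigmoid(t, 60.0, 120.0)     # proprioceptive eye position: delayed

# Combined internal eye-position estimate used for spatial updating
estimate = 0.5 * cd + 0.5 * prop

# Predicted mislocalization of a flash: mismatch between estimate and true eye position
error = estimate - eye                   # degrees; transient around saccade onset
peak = np.max(np.abs(error))
```

In this sketch the error is near zero long before and long after the saccade, and peaks peri-saccadically because the estimate leads the eye before movement onset (corollary discharge) and lags it afterwards (proprioception), qualitatively mirroring the flash mislocalization pattern described in the abstract.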
How observers come to experience a visually stable world despite their own eye movements has been the focus of extensive research for over 20 years. These studies have revealed fundamental mechanisms such as anticipatory receptive field (RF) shifts and the saccadic suppression of stimulus displacements, yet there currently exists no single explanatory framework for these observations. We show that a previously presented neuro-computational model of peri-saccadic mislocalization accounts for the phenomenon of predictive remapping and for the observation of saccadic suppression of displacement (SSD). This converging evidence allows us to identify the potential ingredients of perceptual stability that generalize across different data sets in a formal, physiology-based model. In particular, we propose that predictive remapping stabilizes the visual world across saccades by introducing a feedback loop and that, as an emergent result, small displacements of stimuli go unnoticed by the visual system. The model provides a link from neural dynamics to neural mechanism and finally to behavior, and thus offers a testable, comprehensive framework of visual stability.
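The proposed link between remapping and SSD can be illustrated with a minimal signal-detection sketch. This is not the authors' model; the noise level, criterion, and trial count are invented for illustration. The idea is that if the remapped prediction of the pre-saccadic target position carries noise, a post-saccadic displacement is only reported when the mismatch exceeds that noise, so small steps go unnoticed while large steps are detected.

```python
import numpy as np

rng = np.random.default_rng(0)

def reports_displacement(true_shift, sigma_remap=1.5, criterion=2.0, n_trials=2000):
    # Noisy remapped prediction of the pre-saccadic target position (degrees);
    # sigma_remap and criterion are illustrative assumptions.
    predicted = rng.normal(0.0, sigma_remap, n_trials)
    # After the saccade, the target reappears shifted by true_shift degrees.
    seen = true_shift
    # A displacement is reported only when the mismatch exceeds the criterion.
    return np.mean(np.abs(seen - predicted) > criterion)

small = reports_displacement(0.5)   # small step: mostly absorbed by remapping noise
large = reports_displacement(5.0)   # large step: reliably detected
```

Under these assumptions, detection rates rise steeply with displacement size, reproducing the signature of SSD: small trans-saccadic displacements are attributed to the system's own uncertainty rather than perceived as external motion.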