Despite wide recognition that a moving object is perceived to last longer, scientists do not yet agree on how this illusion occurs. In the present study, we conducted two experiments using two experimental methods, namely duration matching and reproduction, and systematically manipulated the temporal frequency, spatial frequency, and speed of the stimulus, to identify the determinant factor of the illusion. Our results indicated that the speed of the stimulus, rather than temporal frequency or spatial frequency per se, best described the perceived duration of a moving stimulus, with apparent duration increasing in proportion to log speed (Experiments 1 and 2). However, in an additional experiment, we found little or no change in onset and offset reaction times for moving stimuli (Experiment 3). Arguing that speed information is made explicit in higher stages of visual information processing in the brain, we suggest that this illusion is primarily mediated by higher level motion processing stages in the dorsal pathway.
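The reported log-speed relationship can be sketched as a minimal descriptive model. This is an illustration only: the function name, the dilation coefficient `k`, and its value are hypothetical, chosen solely to show the proportionality the abstract describes.

```python
import math

def apparent_duration(physical_ms: float, speed_deg_s: float,
                      k: float = 0.1) -> float:
    """Toy model: perceived duration grows linearly with log speed.

    physical_ms  -- physical duration of the stimulus (ms)
    speed_deg_s  -- stimulus speed (deg/s); k is a hypothetical gain
    """
    if speed_deg_s <= 0:
        # Stationary stimulus: assume no dilation in this sketch.
        return physical_ms
    return physical_ms * (1.0 + k * math.log(speed_deg_s))

# A faster stimulus of the same physical duration appears to last longer.
print(apparent_duration(500, 2.0))
print(apparent_duration(500, 16.0))
```

In this toy form, doubling the speed adds a constant increment to perceived duration, which is the signature of a log-speed dependence.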
Recent neuroscience studies have been concerned with how aimed movements are generated on the basis of target localization. However, visual information from the surroundings as well as from the target can influence arm motor control, in a manner similar to known effects in postural and ocular motor control. Here, we show an ultra-fast manual motor response directly induced by large-field visual motion. This rapid response facilitated the reaction when the subject moved the hand in the direction of the visual motion, suggesting that visually evoked manual control assists postural movement. The latency of muscle activity generating this response was as short as that of the ocular following responses to the visual motion. Abrupt visual motion entrained arm movement without affecting perceptual target localization, and the degree of motion coherence and the speed of the visual stimulus modulated this arm response. This visuomotor behavior was still observed when the visual motion was confined to the "follow-through" phase of a hitting movement, in which no target existed. An analysis of the arm movements suggests that the hitting follow-through made by the subject is not a part of a reaching movement. Moreover, the arm response was systematically modulated by hand bias forces, suggesting that it results from a reflexive control mechanism. We therefore propose that its mechanism is radically distinct from motor control for aimed movements to a target. Rather, in an analogy with reflexive eye movement stabilizing a retinal image, we consider that this mechanism regulates arm movements in parallel with voluntary motor control.
A flash that is presented adjacent to a continuously moving bar is perceived to lag behind the bar. One explanation for this phenomenon is that there is a difference in the persistence of the flash and the bar. Another explanation is that the visual system compensates for the neural delays of processing visual motion information, such as the moving bar, by spatially extrapolating the bar's perceived location forward in space along its expected trajectory. Two experiments demonstrate that neither of these models is tenable. The first experiment masked the flash one video frame after its presentation. The flash was still perceived to lag behind the bar, suggesting that a difference in the persistence of the flash and bar does not cause the apparent offset. The second experiment employed unpredictable changes in the velocity of the bar, including an abrupt reversal, disappearance, acceleration, and deceleration. If the extrapolation model held, the bar would continue to be extrapolated in accordance with its initial velocity until the moment of an abrupt velocity change. The results were inconsistent with this prediction, suggesting that there is little or no spatial compensation for the neural delays of processing moving objects. The results support a new model of temporal facilitation for moving objects whereby the apparent flash lag is due to a latency advantage for moving over flashed stimuli.
A stationary pattern with asymmetrical luminance gradients can appear to move. We hypothesized that the source signal of this illusion originates in retinal image motions due to fixational eye movements. We investigated the inter-subject correlation between fixation instability and illusion strength. First, we demonstrated that the strength of the illusion can be quantified by the nulling technique. Second, we concurrently measured cancellation velocity and fixation instability for each subject, and found a positive correlation between them. The same relationship was also found within a single observer when the visual stimulus was artificially moved in the simulation of fixation instability. Third, we confirmed the same correlation with eye movements for a wider variety of illusory displays. These results suggest that fixational eye movements indeed play a relevant role in generating this motion illusion.
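The nulling technique described above can be sketched as a simple 1-up/1-down staircase: the display is physically drifted against the illusory motion until the pattern appears stationary, and the drift velocity at convergence (the cancellation velocity) quantifies illusion strength. Everything here is a hypothetical stand-in for a real psychophysical session, including `observer_reports_motion`, the step size, and the simulated observer.

```python
def measure_cancellation_velocity(observer_reports_motion,
                                  start=0.0, step=0.05, trials=40):
    """Toy 1-up/1-down staircase estimating the nulling velocity (deg/s)."""
    v = start
    for _ in range(trials):
        if observer_reports_motion(v):  # illusory drift still visible
            v += step                   # null harder
        else:
            v -= step                   # overshot; back off
    return v

# Simulated observer whose illusion is cancelled at 0.8 deg/s:
true_null = 0.8
est = measure_cancellation_velocity(lambda v: v < true_null)
print(round(est, 2))  # converges near the 0.8 deg/s null
```

In the study itself, this cancellation velocity was then correlated with each subject's measured fixation instability.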
In the inferior temporal (IT) cortex of monkeys, which has been shown to play a critical role in colour discrimination, there are neurons sensitive to a narrow range of hues and saturation. By contrast, neurons in the retina and the parvocellular layer of the lateral geniculate nucleus (pLGN) encode colours in a way that does not provide explicit representation of hue or saturation, and the process by which hue- and saturation-selectivity is elaborated remains unknown. We therefore tested the colour-selectivity of neurons in the primary visual cortex (V1) and compared it with those of pLGN and IT neurons. Quantitative analysis was performed using a standard set of colours, systematically distributed within the CIE (Commission Internationale de l'Eclairage)-xy chromaticity diagram. Selectivity for hue and saturation was characterized by analysing response contours reflecting the overall distribution of responses across the chromaticity diagram. We found that the response contours of almost all pLGN neurons were linear and broadly tuned for hue. Many V1 neurons behaved similarly; nonetheless, a considerable number of V1 neurons had clearly curved response contours and were selective for a narrow range of hues or saturation. The relative frequencies of neurons exhibiting various selectivities for hue and saturation were remarkably similar in the V1 and IT cortex, but were clearly different in the pLGN. Thus, V1 apparently plays a very important role in the conversion of colour signals necessary for generating the elaborate colour selectivity observed in the IT cortex.
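The linear-versus-curved response-contour analysis can be illustrated with a toy planarity test: fit a plane to a cell's responses over CIE xy chromaticity and inspect the residual error. A pLGN-like "linear" cell is well fit by the plane; a narrowly hue-tuned cell is not. The stimulus coordinates and both model cells below are hypothetical, chosen only to illustrate the analysis logic.

```python
import numpy as np

# Hypothetical sample of CIE xy chromaticity coordinates.
xy = np.array([[0.20, 0.30], [0.30, 0.30], [0.40, 0.35],
               [0.30, 0.45], [0.45, 0.40], [0.25, 0.50]])

def planarity_error(responses: np.ndarray) -> float:
    """RMS residual after fitting a plane (linear model) over xy."""
    A = np.column_stack([xy, np.ones(len(xy))])
    coef, *_ = np.linalg.lstsq(A, responses, rcond=None)
    pred = A @ coef
    return float(np.sqrt(np.mean((responses - pred) ** 2)))

# A "linear" cell: response is an exact linear function of chromaticity.
linear_cell = xy @ np.array([2.0, -1.0]) + 0.5
# A "curved" cell: a narrow Gaussian bump around one hue (hypothetical).
tuned_cell = np.exp(-np.sum((xy - [0.40, 0.35]) ** 2, axis=1) / 0.002)

print(planarity_error(linear_cell) < planarity_error(tuned_cell))  # True
```

The abstract's finding is that pLGN cells look like `linear_cell`, while a substantial fraction of V1 and IT cells look like `tuned_cell`.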
The flash-lag effect refers to the phenomenon in which a flash adjacent to a continuously moving object is perceived to lag behind it. To test three previously proposed hypotheses (motion extrapolation, positional averaging, and differential latency), a new stimulus configuration, to which the three hypotheses give different predictions, was introduced. Instead of continuous motion, a randomly jumping bar was used as the moving stimulus, relative to which the position of the flash was judged. The results were visualized as a spatiotemporal correlogram, in which the response to a flash was plotted at the space-time relative to the position and onset of the jumping bar. The actual human performance was not consistent with any of the original hypotheses. However, all the results were explained well if the differential latency was assumed to fluctuate considerably, its probability density function being approximated by Gaussian. Also, the model fit well with previously published data on the flash-lag effect.
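The fluctuating differential-latency account can be sketched as a one-line generative model: on each trial the flash is processed with an extra latency drawn from a Gaussian, and the spatial lag is that latency difference multiplied by the bar's speed. The mean and SD values below are hypothetical, chosen only to illustrate the structure of the model.

```python
import random
import statistics

MEAN_DELTA = 0.080  # s, assumed mean flash-vs-motion latency difference
SD_DELTA = 0.030    # s, assumed trial-to-trial fluctuation (Gaussian)

def perceived_lag(bar_speed_deg_s: float) -> float:
    """One trial's spatial flash-lag: speed x sampled latency difference."""
    return bar_speed_deg_s * random.gauss(MEAN_DELTA, SD_DELTA)

random.seed(0)
lags = [perceived_lag(10.0) for _ in range(10_000)]
print(round(statistics.mean(lags), 2))  # ~ 10 deg/s x 0.08 s = ~0.8 deg
```

With a jumping bar, the same Gaussian spread predicts the graded, smeared spatiotemporal correlogram the study reports, rather than the sharp transitions the original three hypotheses predict.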
A shaky hand holding a video camera invariably turns a treasured moment into an annoying, jittery memento. More recent consumer cameras thoughtfully offer stabilization mechanisms to compensate for our unsteady grip. Our eyes face a similar challenge in that they are constantly making small movements even when we try to maintain a fixed gaze. What should be substantial, distracting jitter passes completely unseen. Position changes from large eye movements (saccades) seem to be corrected on the basis of extraretinal signals such as the motor commands sent to the eye muscles, and the resulting motion responses seem to be simply switched off. But this approach is impracticable for incessant, small displacements, and here we describe a novel visual illusion that reveals a compensation mechanism based on visual motion signals. Observers were adapted to a patch of dynamic random noise and then viewed a larger pattern of static random noise. The static noise in the unadapted regions then appeared to 'jitter' coherently in random directions. Several observations indicate that this visual jitter directly reflects fixational eye movements. We propose a model that accounts for this illusion as well as the stability of the visual world during small and/or slow eye movements such as fixational drift, smooth pursuit and low-amplitude mechanical vibrations of the eyes.
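The proposed compensation can be caricatured as subtracting a common motion reference from local motion signals: whole-field motion is attributed to the eye and cancelled. Adapting one region lowers its motion gain, so after adaptation the subtraction no longer balances and residual "jitter" emerges in the unadapted region. The gains below are hypothetical and this is a toy version of the subtraction idea, not the paper's full model.

```python
def perceived_motion(eye_motion: float,
                     gain_adapted: float = 0.3,
                     gain_unadapted: float = 1.0):
    """Toy compensation: subtract a common reference from local signals.

    In this sketch the reference is set by the weakest (adapted) signal,
    so the adapted region is fully cancelled and the unadapted region
    retains a residual, which is the illusory jitter.
    """
    reference = gain_adapted * eye_motion
    adapted = gain_adapted * eye_motion - reference      # cancelled
    unadapted = gain_unadapted * eye_motion - reference  # residual jitter
    return adapted, unadapted

a, u = perceived_motion(1.0)
print(a, u)  # adapted region is stable; unadapted region retains motion
```

Before adaptation the two gains are equal, both regions cancel exactly, and the world appears stable despite continuous fixational eye movements.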