To perceive the auditory and visual aspects of a physical event as occurring simultaneously, the brain must adjust for differences between the two modalities in both physical transmission time and sensory processing time. One possible strategy to overcome this difficulty is to adaptively recalibrate the simultaneity point from daily experience of audiovisual events. Here we report that after exposure to a fixed audiovisual time lag for several minutes, human participants showed shifts in their subjective simultaneity responses toward that particular lag. This 'lag adaptation' also altered the temporal tuning of an auditory-induced visual illusion, suggesting that adaptation occurred via changes in sensory processing, rather than as a result of a cognitive shift while making task responses. Our findings suggest that the brain attempts to adjust subjective simultaneity across different modalities by detecting and reducing time lags between inputs that likely arise from the same physical events.
The world is full of surfaces, and by looking at them we can judge their material qualities. Properties such as colour or glossiness can help us decide whether a pancake is cooked, or a patch of pavement is icy. Most studies of surface appearance have emphasized textureless matte surfaces, but real-world surfaces, which may have gloss and complex mesostructure, are now receiving increased attention. Their appearance results from a complex interplay of illumination, reflectance and surface geometry, which are difficult to tease apart given an image. If there were simple image statistics that were diagnostic of surface properties, it would be sensible to use them. Here we show that the skewness of the luminance histogram and the skewness of sub-band filter outputs are correlated with surface gloss and inversely correlated with surface albedo (diffuse reflectance). We find evidence that human observers use skewness, or a similar measure of histogram asymmetry, in making judgements about surfaces. When the image of a surface has positively skewed statistics, it tends to appear darker and glossier than a similar surface with lower skewness, and this is true whether the skewness is inherent to the original image or is introduced by digital manipulation. We also find a visual after-effect based on skewness: adaptation to patterns with skewed statistics can alter the apparent lightness and glossiness of surfaces that are subsequently viewed. We suggest that there are neural mechanisms sensitive to skewed statistics, and that their outputs can be used in estimating surface properties.
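The skewness statistic referenced in this abstract is the third standardized moment of the luminance distribution. As a minimal illustrative sketch (not the authors' actual analysis pipeline), it can be computed from a flat list of pixel luminances:

```python
def skewness(values):
    """Third standardized moment: positive for a long bright tail,
    negative for a long dark tail, zero for a symmetric distribution."""
    n = len(values)
    mean = sum(values) / n
    m2 = sum((v - mean) ** 2 for v in values) / n  # population variance
    m3 = sum((v - mean) ** 3 for v in values) / n  # third central moment
    return m3 / m2 ** 1.5

# A symmetric histogram has zero skewness; a few very bright pixels
# against a darker background yield positive skewness, the pattern the
# abstract associates with a darker, glossier appearance.
print(skewness([1, 2, 3, 4, 5]))       # symmetric -> 0.0
print(skewness([10, 10, 10, 10, 90]))  # bright tail -> positive
```

In the paper's framing, this scalar (and the analogous skew of sub-band filter outputs) is a cheap image statistic correlated with gloss; the sketch above omits the sub-band decomposition and any image I/O.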
A fundamental question about the perception of time is whether the neural mechanisms underlying temporal judgements are universal and centralized in the brain or modality specific and distributed. Time perception has traditionally been thought to be entirely dissociated from spatial vision. Here we show that the apparent duration of a dynamic stimulus can be manipulated in a local region of visual space by adapting to oscillatory motion or flicker. This implicates spatially localized temporal mechanisms in duration perception. We do not see concomitant changes in the time of onset or offset of the test patterns, demonstrating a direct local effect on duration perception rather than an indirect effect on the time course of neural processing. The effects of adaptation on duration perception can also be dissociated from motion or flicker perception per se. Although 20 Hz adaptation reduces both the apparent temporal frequency and duration of a 10 Hz test stimulus, 5 Hz adaptation increases apparent temporal frequency but has little effect on duration perception. We conclude that there is a peripheral, spatially localized, essentially visual component involved in sensing the duration of visual events.
We propose that the perception of the relative time of events is based on the relationship of representations of temporal pattern that we term time markers. We conclude that the perceptual asynchrony effects studied here do not reflect differential neural delays for different attributes; rather, they arise from a faulty correspondence match between color transitions and position transitions (motion), which in turn results from a difficulty in detecting turning points (direction reversals) and a preference for matching markers of the same type.
Recent neuroscience studies have been concerned with how aimed movements are generated on the basis of target localization. However, visual information from the surroundings as well as from the target can influence arm motor control, in a manner similar to known effects in postural and ocular motor control. Here, we show an ultra-fast manual motor response directly induced by a large-field visual motion. This rapid response aided reaction when the subject moved his hand in the direction of visual motion, suggesting assistive visually evoked manual control during postural movement. The latency of muscle activity generating this response was as short as that of the ocular following responses to the visual motion. Abrupt visual motion entrained arm movement without affecting perceptual target localization, and the degrees of motion coherence and speed of the visual stimulus modulated this arm response. This visuomotor behavior was still observed when the visual motion was confined to the "follow-through" phase of a hitting movement, in which no target existed. An analysis of the arm movements suggests that the hitting follow through made by the subject is not a part of a reaching movement. Moreover, the arm response was systematically modulated by hand bias forces, suggesting that it results from a reflexive control mechanism. We therefore propose that its mechanism is radically distinct from motor control for aimed movements to a target. Rather, in an analogy with reflexive eye movement stabilizing a retinal image, we consider that this mechanism regulates arm movements in parallel with voluntary motor control.