A fundamental question about the perception of time is whether the neural mechanisms underlying temporal judgements are universal and centralized in the brain or modality specific and distributed. Time perception has traditionally been thought to be entirely dissociated from spatial vision. Here we show that the apparent duration of a dynamic stimulus can be manipulated in a local region of visual space by adapting to oscillatory motion or flicker. This implicates spatially localized temporal mechanisms in duration perception. We do not see concomitant changes in the time of onset or offset of the test patterns, demonstrating a direct local effect on duration perception rather than an indirect effect on the time course of neural processing. The effects of adaptation on duration perception can also be dissociated from motion or flicker perception per se. Although 20 Hz adaptation reduces both the apparent temporal frequency and duration of a 10 Hz test stimulus, 5 Hz adaptation increases apparent temporal frequency but has little effect on duration perception. We conclude that there is a peripheral, spatially localized, essentially visual component involved in sensing the duration of visual events.
Humans intuitively evaluate their decisions by forming different levels of confidence. Despite being highly correlated, decisional confidence and sensitivity can be differentiated. The computational processes underlying this remain unknown. Here we find that, for visual judgments concerning global direction, signal range has a greater impact on confidence than it does on sensitivity. We equated sensitivity for stimuli containing different degrees of directional variability. This failed, however, to equate confidence: participants were less confident when judging more variable signals despite constant sensitivity. When stimuli were instead calibrated to equate confidence, participants were more sensitive when judging more variable signals. Directional range had no impact on an unrelated judgment of brightness, helping to establish that these results cannot be attributed to a simple decisional confound. Our complementary results show that directional sensitivity and decisional confidence rely on independent transformations of sensory input. We propose that confidence will generally be shaped by the range of differently tuned neural mechanisms responsive to input during evidence accumulation, with this having a lesser impact on sensitivity.
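The dissociation above hinges on equating sensitivity across conditions. A minimal sketch of how sensitivity is typically indexed, assuming a standard equal-variance signal detection model (not the authors' specific analysis): d′ is computed from hit and false-alarm rates, and two conditions can be matched on d′ while confidence, measured separately, still differs. The rates below are hypothetical.

```python
# Sketch of equal-variance signal detection theory:
# sensitivity d' = z(hit rate) - z(false-alarm rate).
from statistics import NormalDist

def d_prime(hit_rate: float, fa_rate: float) -> float:
    """Sensitivity index from hit and false-alarm rates."""
    z = NormalDist().inv_cdf  # inverse standard normal CDF
    return z(hit_rate) - z(fa_rate)

# Hypothetical low- and high-variability conditions with matched d',
# illustrating that equal sensitivity does not fix other judgments
# (such as confidence) made on the same trials.
low_var = d_prime(0.80, 0.20)
high_var = d_prime(0.85, 0.26)
print(round(low_var, 2), round(high_var, 2))  # both ≈ 1.68
```

Matching d′ in this way is what makes the residual confidence difference between conditions informative.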
When a unique "oddball" stimulus is embedded in a train of repeated standard stimuli, its duration can seem relatively exaggerated (V. Pariyadath & D. Eagleman, 2007; P. U. Tse, J. Intriligator, J. Rivest, & P. Cavanagh, 2004). We explored the possibility of a link between this and signal intensity reductions at low levels of visual processing. In Experiment 1, we used Troxler fading as a metric of signal intensity: the apparent fading of a stimulus with prolonged viewing (I. P. V. Troxler, 1804). Fading was exaggerated by presenting oddball and standard stimuli to different eyes. However, there was no fading difference when standard stimuli were presented persistently or intermittently. These results contrast with oddball effects, which were insensitive to eye of origin, and which were contingent on intermittent standard stimuli. In Experiment 2, we show that oddball effects can be elicited with oddballs that are less intense versions of repetitive stimuli, and in Experiment 3, we show that oddball effects can scale with the discrepancy between repeated and oddball stimuli. These observations discredit any oddball effect explanation predicated on low-level neural response magnitudes to individual stimuli. Instead, our data support the view that oddball effects are driven by predictive coding (V. Pariyadath & D. Eagleman, 2007), reflecting the discrepancy between expected and actual inputs.
We investigated the effect of adaptation on orientation discrimination using two experienced observers, then replicated the main effects using a total of 50 naïve subjects. Orientation discrimination around vertical improved after adaptation to either horizontal or vertical gratings, but was impaired by adaptation at 7.5 or 15 degrees from vertical. Improvement was greatest when adapter and test were orthogonal. We show that the results can be understood in terms of a functional model of adaptation in cortical vision.
We examined whether the detection of audio-visual temporal synchrony is determined by a pre-attentive parallel process, or by an attentive serial process, using a visual search paradigm. We found that detection of a visual target that changed in synchrony with an auditory stimulus was gradually impaired as the number of unsynchronized visual distractors increased (experiment 1), whereas synchrony discrimination of an attended target in a pre-cued location was unaffected by the presence of distractors (experiment 2). The effect of distractors cannot be ascribed to reduced target visibility, nor can the increase in false alarm rates be predicted by a noisy parallel processing model. Reaction times for target detection increased linearly with the number of distractors, with the slope being about twice as steep for target-absent trials as for target-present trials (experiment 3). Similar results were obtained regardless of whether the audio-visual stimulus consisted of visual flashes synchronized with amplitude-modulated pips, or of visual rotations synchronized with frequency-modulated up-down sweeps. All of the results indicate that audio-visual perceptual synchrony is judged by a serial process and are consistent with the suggestion that audio-visual temporal synchrony is detected by a 'mid-level' feature matching process.
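The roughly 2:1 ratio of target-absent to target-present slopes is the classic signature of a serial self-terminating search: a target is found after inspecting (n + 1) / 2 items on average, whereas confirming its absence requires inspecting all n. A minimal sketch with hypothetical timing parameters (not the paper's data) shows how the ratio arises:

```python
# Serial self-terminating search: expected reaction time as a
# function of set size n. Per-item time and base time are
# hypothetical values for illustration only.
T_ITEM = 50.0   # assumed per-item inspection time (ms)
T_BASE = 400.0  # assumed base (non-search) response time (ms)

def rt_present(n: int) -> float:
    # On average the target is found halfway through the display.
    return T_BASE + T_ITEM * (n + 1) / 2

def rt_absent(n: int) -> float:
    # Absence can only be confirmed after checking every item.
    return T_BASE + T_ITEM * n

# Set-size slopes estimated from two display sizes (ms per item)
slope_present = (rt_present(8) - rt_present(2)) / 6  # 25 ms/item
slope_absent = (rt_absent(8) - rt_absent(2)) / 6     # 50 ms/item
print(slope_absent / slope_present)  # → 2.0
```

A pre-attentive parallel process would instead predict slopes near zero for both trial types, which is why the observed linear increases support the serial account.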
Motion contained within a static object can cause illusory position shifts toward the direction of internal motion. Here we present data suggesting that this illusion is driven by modulations of apparent contrast. We observe position shifts at blurred stimulus regions without corresponding changes to internal structure, and find that low-contrast targets are more difficult to detect at the trailing, as opposed to leading, edges of movement. Motion-induced position shifts are also shown to occur without conscious appreciation of motion direction. Our data suggest that motion can influence spatial coding via interactions that modulate apparent contrast, thereby changing the regions of the stimulus that are visible.
It has been demonstrated that subjects do not report changes in color and direction of motion as being coincident when they occur synchronously. Instead, for the changes to be reported as being synchronous, changes in direction of motion must precede changes in color. To explain this observation, some researchers have suggested that the neural processing of color and motion is asynchronous. This interpretation has been criticized on the basis that processing time may not correlate directly and invariantly with perceived time of occurrence. Here we examine this possibility by making use of the color-contingent motion aftereffect. By correlating color states disproportionately with two directions of motion, we produced and measured color-contingent motion aftereffects as a function of the range of physical correlations. The aftereffects observed are consistent with the perceptual correlation between color and motion being different from the physical correlation. These findings demonstrate asynchronous processing for different stimulus attributes, with color being processed more quickly than motion. This suggests that the time course of perceptual experience correlates directly with that of neural activity.