Despite 2 centuries of research, the question of whether attending to a sensory modality speeds the perception of stimuli in that modality has yet to be resolved. The authors highlight weaknesses inherent in this previous research and report the results of 4 experiments in which a novel methodology was used to investigate the effects on temporal order judgments (TOJs) of attending to a particular sensory modality or spatial location. Participants were presented with pairs of visual and tactile stimuli from the left and/or right at varying stimulus onset asynchronies and were required to make unspeeded TOJs regarding which stimulus appeared first. The results provide the strongest evidence to date for the existence of multisensory prior entry and support previous claims for attentional biases toward the visual modality and toward the right side of space. These findings have important implications for studies in many areas of human and animal cognition.
It has long been claimed that attended stimuli are perceived prior to unattended stimuli, a claim known as the doctrine of prior entry. Most, if not all, of the studies on which such claims have been based, however, are open to a nonattentional interpretation involving response bias, leading some researchers to assert that prior entry may not exist. Given this controversy, we introduce a novel methodology that minimizes the effect of response bias by manipulating attention and response demands along orthogonal dimensions. Attention was oriented to the left or right (i.e., spatially), but instead of reporting on the basis of location, observers reported the order (first or second) of vertical versus horizontal line segments. Although second-order response biases were demonstrated, effects of attention in accordance with the law of prior entry were clearly obtained following both exogenous and endogenous attentional cuing.
The relative spatiotemporal correspondence between sensory events affects multisensory integration across a variety of species; integration is maximal when stimuli in different sensory modalities are presented from approximately the same position at about the same time. In the present study, we investigated the influence of spatial and temporal factors on audio-visual simultaneity perception in humans. Participants made unspeeded simultaneous versus successive discrimination responses to pairs of auditory and visual stimuli presented at varying stimulus onset asynchronies from either the same or different spatial positions using either the method of constant stimuli (Experiments 1 and 2) or psychophysical staircases (Experiment 3). The participants in all three experiments were more likely to report the stimuli as being simultaneous when they originated from the same spatial position than when they came from different positions, demonstrating that the apparent perception of multisensory simultaneity is dependent on the relative spatial position from which stimuli are presented.
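The contrast between the two psychophysical procedures mentioned above can be illustrated with a toy simulation of an adaptive staircase. Everything below is a hypothetical sketch, not the procedure or parameter values from the experiments: the simulated observer, its simultaneity-window widths, and the staircase settings are all illustrative assumptions.

```python
import math
import random

def p_simultaneous(soa_ms, spatial_gap):
    """Toy observer: probability of a 'simultaneous' report at a given SOA.
    Window widths are illustrative assumptions, not values from the study;
    same-position pairs are given a wider simultaneity window."""
    window = 60.0 if spatial_gap else 80.0
    return math.exp(-abs(soa_ms) / window)

def staircase(spatial_gap, start=200.0, step=10.0, trials=2000, seed=1):
    """1-up/1-down staircase converging on the SOA where P('simultaneous') = .5."""
    rng = random.Random(seed)
    soa, track = start, []
    for _ in range(trials):
        if rng.random() < p_simultaneous(soa, spatial_gap):
            soa += step                   # reported simultaneous -> harder next trial
        else:
            soa = max(soa - step, 0.0)    # reported successive -> easier next trial
        track.append(soa)
    return sum(track[trials // 2:]) / (trials - trials // 2)  # mean of last half

thr_same = staircase(spatial_gap=False)   # stimuli from the same position
thr_diff = staircase(spatial_gap=True)    # stimuli from different positions
```

Because the simulated window is wider for same-position pairs, the staircase settles at a larger SOA for them, mirroring the finding that spatially coincident stimuli are more often judged simultaneous. The method of constant stimuli would instead sample a fixed set of SOAs and estimate the same threshold from the full response curve.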
In the collection, analysis, and interpretation of any test data, psychometric properties, such as those reported here for the ANT, must be carefully considered.
Using functional magnetic resonance imaging (fMRI) in humans, we identified regions of cortex involved in the encoding of limb position. Tactile stimulation of the right hand, positioned across the body midline, activated the right parietal cortex when the eyes were closed; this activation shifted to a left parietofrontal network when the eyes were open. These data reveal important similarities between human and non-human primates in the network of brain areas involved in the multisensory representation of limb position.
In two experiments, we examined the extent to which audiovisual temporal order judgments (TOJs) were affected by spatial factors and by the dimension along which TOJs were made. Pairs of auditory and visual stimuli were presented from the left and/or right of fixation at varying stimulus onset asynchronies (SOAs), and participants made unspeeded TOJs regarding either "Which modality was presented first?" (Experiment 1) or "Which side was presented first?" (Experiment 2). Modality TOJs were more accurate (i.e., just-noticeable differences, JNDs, were smaller) when the auditory and visual stimuli were presented from different spatial positions rather than from the same position, highlighting an important potential confound inherent in previous research. By contrast, spatial TOJs were unaffected by whether or not the two stimuli were presented in different modalities. A between-experiments comparison revealed more accurate performance (i.e., smaller JNDs) when people reported which modality came first than when they reported which side came first for identical bimodal stimulus pairs. These results demonstrate that multisensory TOJs are critically dependent both on the relative spatial position from which stimuli are presented and on the particular dimension being judged.
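The JND, together with the point of subjective simultaneity (PSS), is typically derived by fitting a psychometric function to the proportion of (for example) "visual first" responses across SOAs. The sketch below uses made-up response data and a crude grid-search least-squares fit to a cumulative Gaussian; it illustrates the logic only, since real analyses usually use maximum-likelihood fitting.

```python
import math

def cum_gauss(soa, pss, sigma):
    """P('visual first') under a cumulative-Gaussian psychometric model."""
    return 0.5 * (1.0 + math.erf((soa - pss) / (sigma * math.sqrt(2.0))))

# Hypothetical data: SOA in ms (negative = auditory led), observed P('visual first').
soas     = [-90, -60, -30, 0, 30, 60, 90]
observed = [0.04, 0.11, 0.30, 0.55, 0.80, 0.93, 0.97]

# Crude grid search over (PSS, sigma) minimizing squared error to the data.
pss, sigma = min(
    ((p, s) for p in range(-30, 31) for s in range(10, 101)),
    key=lambda ps: sum((cum_gauss(x, *ps) - o) ** 2 for x, o in zip(soas, observed)),
)
jnd = 0.6745 * sigma  # SOA shift from the 50% to the 75% point of the fitted curve
print(f"PSS = {pss} ms, JND = {jnd:.1f} ms")
```

A smaller JND means the two points of the curve are closer together, i.e., finer temporal resolution; a nonzero PSS means one modality must lead for the pair to be perceived as simultaneous.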