When we look at our hands, we immediately know that they are part of our own body. This feeling of ownership of our limbs is a fundamental aspect of self-consciousness. We studied the neuronal counterparts of this experience by using a perceptual illusion to manipulate feelings of ownership of a rubber hand presented in front of healthy subjects while their brain activity was measured with functional magnetic resonance imaging. Neural activity in the premotor cortex reflected the feeling of ownership of the hand, suggesting that multisensory integration in the premotor cortex provides a mechanism for bodily self-attribution.
In many everyday situations, our senses are bombarded by a variety of unisensory signals at any given time. To gain the most veridical, and least variable, estimate of environmental stimuli and their properties, we need to combine the individual noisy unisensory perceptual estimates that refer to the same object, while keeping separate those estimates that belong to different objects or events. How, though, does the brain "know" which stimuli to combine? Traditionally, researchers interested in the crossmodal binding problem have focused on the roles that spatial and temporal factors play in modulating multisensory integration. However, crossmodal correspondences between various unisensory features (such as between auditory pitch and visual size) may provide another important means of constraining the crossmodal binding problem. A large body of research now shows that people exhibit consistent crossmodal correspondences between many stimulus features in different sensory modalities; for example, they reliably match high-pitched sounds with small, bright objects located high up in space. The literature reviewed here supports the view that crossmodal correspondences need to be considered, alongside semantic and spatiotemporal congruency, as one of the key constraints that help our brains solve the crossmodal binding problem.
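The optimal-combination idea invoked above is often formalized as maximum-likelihood (inverse-variance-weighted) cue integration. The abstract does not commit to a particular model, so the following Python sketch, with an ad hoc helper name and hypothetical visual and auditory estimates, is only an illustration of why fusing signals that belong to the same object yields the least variable estimate:

# Minimal sketch of maximum-likelihood cue combination (an illustration,
# not a model described in the abstract): two noisy unisensory estimates
# of the same property are averaged with weights proportional to their reliability.

def combine_estimates(est_a, var_a, est_b, var_b):
    """Inverse-variance-weighted average of two unisensory estimates.
    Returns the combined estimate and its (reduced) variance."""
    w_a = (1 / var_a) / (1 / var_a + 1 / var_b)   # the more reliable cue gets the larger weight
    w_b = 1 - w_a
    combined = w_a * est_a + w_b * est_b
    combined_var = 1 / (1 / var_a + 1 / var_b)    # never exceeds the smaller unisensory variance
    return combined, combined_var

# Hypothetical example: a precise visual location estimate (variance 1.0)
# and a noisier auditory estimate (variance 4.0) of the same event.
print(combine_estimates(10.0, 1.0, 14.0, 4.0))    # -> (10.8, 0.8)

The combined variance is always at or below the smaller unisensory variance, which is the sense in which integration is beneficial; the binding problem is deciding when this fusion rule should be applied at all, and crossmodal correspondences are one cue to that decision.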
The sense of body ownership represents a fundamental aspect of our self-awareness, but it is disrupted in many neurological, psychiatric, and psychological conditions that are also characterized by disrupted skin-temperature regulation, sometimes in a single limb. We hypothesized that skin temperature in a specific limb could be disrupted by psychologically disrupting the sense of ownership of that limb. In six separate experiments, using an established protocol to induce the rubber hand illusion, we demonstrate that skin temperature of the real hand decreases when we take ownership of an artificial counterpart. The decrease in skin temperature is limb-specific: it occurs neither in the unstimulated hand nor in the ipsilateral foot. The effect is not evoked by tactile or visual input per se, by simultaneous tactile and visual input per se, or by a shift in attention toward the experimental side or limb. In fact, taking ownership of an artificial hand slows the processing of tactile information from the real hand, an effect also observed in patients who show body disownership after stroke. These findings of psychologically induced, limb-specific disruption of temperature regulation provide the first evidence that taking ownership of an artificial body part has consequences for the real body part; that the awareness of our physical self and the physiological regulation of self are closely linked in a top-down manner; and that cognitive processes that disrupt the sense of body ownership may in turn disrupt temperature regulation in the numerous conditions characterized by both.
When the apparent visual location of a body part conflicts with its veridical location, vision can dominate proprioception and kinesthesia. In this article, we show that vision can capture tactile localization. Participants discriminated the location of vibrotactile stimuli (upper, at the index finger, vs. lower, at the thumb), while ignoring distractor lights that could independently be upper or lower. Such tactile discriminations were slowed when the distractor light was incongruent with the tactile target (e.g., an upper light during lower touch) rather than congruent, especially when the lights appeared near the stimulated hand. The hands were occluded under a table, with all distractor lights above the table. The effect of the distractor lights increased when rubber hands were placed on the table, "holding" the distractor lights, but only when the rubber hands were spatially aligned with the participant's own hands. In this aligned situation, participants were more likely to report the illusion of feeling touch at the rubber hands. Such visual capture of touch appears cognitively impenetrable.
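The interference described here is commonly quantified as the crossmodal congruency effect (CCE): the mean reaction-time cost of incongruent relative to congruent distractor lights. The abstract does not spell out this computation, so the Python sketch below, with an illustrative helper name and hypothetical reaction times, is only a sketch of the measure:

from statistics import mean

def crossmodal_congruency_effect(rts_congruent, rts_incongruent):
    """CCE in ms: positive values mean incongruent distractor lights
    slowed the tactile elevation judgments."""
    return mean(rts_incongruent) - mean(rts_congruent)

# Hypothetical reaction times (ms) from trials with rubber hands aligned
# with the participant's own hands.
congruent = [512, 498, 530, 505]
incongruent = [575, 560, 590, 582]
print(crossmodal_congruency_effect(congruent, incongruent))  # ~65 ms

On this measure, the key result above corresponds to a larger CCE when spatially aligned rubber hands "hold" the distractor lights than when the rubber hands are misaligned or absent.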
Despite 2 centuries of research, the question of whether attending to a sensory modality speeds the perception of stimuli in that modality has yet to be resolved. The authors highlight weaknesses inherent in this previous research and report the results of 4 experiments in which a novel methodology was used to investigate the effects on temporal order judgments (TOJs) of attending to a particular sensory modality or spatial location. Participants were presented with pairs of visual and tactile stimuli from the left and/or right at varying stimulus onset asynchronies and were required to make unspeeded TOJs regarding which stimulus appeared first. The results provide the strongest evidence to date for the existence of multisensory prior entry and support previous claims for attentional biases toward the visual modality and toward the right side of space. These findings have important implications for studies in many areas of human and animal cognition.
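Prior entry in a TOJ task is typically quantified as a shift in the point of subjective simultaneity (PSS). The abstract does not describe the fitting procedure, but a common approach, sketched here in Python with hypothetical data and illustrative names, fits a cumulative Gaussian to the proportion of "visual first" responses across SOAs:

import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def cum_gauss(soa, pss, slope):
    """Probability of a "visual first" response as a function of SOA
    (positive SOA = the visual stimulus physically led the tactile one)."""
    return norm.cdf(soa, loc=pss, scale=slope)

# Hypothetical data: SOAs in ms and the proportion of "visual first"
# responses when attention was directed to the tactile modality.
soas = np.array([-200, -90, -30, 0, 30, 90, 200])
p_visual_first = np.array([0.05, 0.15, 0.35, 0.45, 0.60, 0.85, 0.97])

(pss, slope), _ = curve_fit(cum_gauss, soas, p_visual_first, p0=[0.0, 50.0])
print(f"PSS = {pss:.1f} ms")  # a positive PSS here means vision must lead to be
                              # perceived as simultaneous: tactile prior entry

The signature of prior entry is a PSS shifted such that the unattended stimulus must physically lead for the pair to seem simultaneous; equivalently, the attended stimulus needs less of a head start to be judged first.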
Covert orienting in hearing was examined by presenting auditory spatial cues prior to an auditory target, requiring either a choice or a detection response. Targets and cues appeared to the left or right of subjects' midline. Localization of the target in orthogonal directions (up vs. down or front vs. back, independent of target side) was faster when the cue and target appeared on the same rather than opposite sides. This benefit was larger and more durable when the cue predicted the target side. These effects cannot reflect criterion shifts, suggesting that covert orienting enhances auditory localization. Fine frequency discriminations also benefited from predictive spatial cues, although uninformative cues affected only spatial discriminations. No cuing effects were observed in a detection task.