In human adults, visual dominance emerges in several multisensory tasks. In children, auditory dominance has been reported up to 4 years of age. To establish when sensory dominance changes during development, 41 children (aged 6-7, 9-10, and 11-12 years) were tested on the Colavita task (Experiment 1) and 32 children (aged 6-7, 9-10, and 11-12 years) were tested on the sound-induced flash illusion (Experiment 2). In both experiments, auditory dominance emerged in 6- to 7-year-old children compared to older children. Adult-like visual dominance began to emerge at 9 to 10 years of age and consolidated in 11- to 12-year-old children. These findings show that auditory dominance persists until at least age 6 but shifts to visual dominance during the first school years.
Vision of the body is known to affect somatosensory perception (e.g. proprioception or tactile discrimination). However, it is unknown whether visual information about one's own body size can influence bodily action. We tested this by measuring the maximum grip aperture (MGA) of grasping movements while eight subjects viewed a real-size, enlarged, or shrunken image of their hand reaching to grasp a cylinder. In the enlarged-view condition, the MGA decreased relative to the real-size view, as if the grasping movement were actually executed with a physically larger hand, thus requiring a smaller grip aperture to grasp the cylinder. Interestingly, the MGA remained smaller even after visual feedback was removed. In contrast, no effect was found for the shrunken-view condition. This asymmetry may reflect the fact that enlargement of body parts is experienced more frequently than shrinkage, notably during normal growth. In conclusion, vision of the body can significantly and persistently affect the internal model of the body used for motor programming.
In this study, we investigated the contribution of tactile and proprioceptive cues to the development of the sense of body ownership by testing the susceptibility of 4- to 5-year-old children, 8- to 9-year-old children, and adults to the somatic rubber-hand illusion (SRHI). We found that feelings of owning a rubber hand in the SRHI paradigm, as assessed by explicit reports (i.e., a questionnaire), are already present by age 4 and do not change throughout development. In contrast, the effect of the illusion on the sense of hand position, as assessed by a pointing task, was present only in 8- to 9-year-old children and adults; the magnitude of this capture increased with age. Our findings reveal that tactile-proprioceptive interactions contributed differently to the two aspects characterizing the SRHI: Although the contribution of such interactions to an explicit sense of self was similar across age groups, their contribution to the more implicit recalibration of hand position is still developing by age 9.
We investigated temporal processing in profoundly deaf individuals by testing their ability to make temporal order judgments (TOJs) for pairs of visual stimuli presented at central or peripheral visual eccentricities. Ten profoundly deaf participants judged which of two visual stimuli appearing on opposite sides of central fixation was delivered first. Stimuli were presented either symmetrically, at central or peripheral locations, or asymmetrically (i.e. one central and the other peripheral) at varying stimulus onset asynchronies (SOAs) using the method of constant stimuli. Two groups of hearing controls were also tested on this task: 10 hearing controls who were auditory-deprived during testing and 12 hearing controls who were not subjected to any deprivation procedure. Temporal order thresholds (i.e. just noticeable differences) and points of subjective simultaneity for the two visual stimuli did not differ between groups. However, faster discrimination responses were systematically observed in the deaf participants than in either group of hearing controls, especially when the first of the two stimuli appeared at a peripheral location. Contrary to some previous findings, our results show that life-long auditory deprivation does not alter temporal processing abilities in the millisecond range: deaf participants achieved temporal thresholds similar to those of hearing controls, while also responding much faster. This enhanced reactivity is documented here for the first time in the context of a temporal processing task, and we suggest it may constitute a critical aspect of the functional changes occurring as a consequence of profound deafness.
Interest in crossmodal correspondences has recently seen a renaissance thanks to numerous studies in human adults. Yet very little is known about crossmodal correspondences in children, particularly for sensory pairings other than audition and vision. In the current study, we investigated whether 4- to 5-year-old children match auditory pitch to the spatial motion of visual objects (audio-visual condition). In addition, we investigated whether this correspondence extends to touch, i.e., whether children also match auditory pitch to the spatial motion of tactile stimuli (audio-tactile condition) and the spatial motion of visual objects to that of tactile stimuli (visuo-tactile condition). In two experiments, two different groups of children were asked either to indicate which of two stimuli fitted best with a centrally located third stimulus (Experiment 1) or to report whether two presented stimuli fitted together well (Experiment 2). We found sensitivity to the congruency of all of the sensory pairings only in Experiment 2, suggesting that these correspondences can be observed only under specific circumstances. Our results suggest that pitch-height correspondences for audio-visual and audio-tactile combinations may still be weak in preschool children, and we speculate that this may be because the relevant linguistic and auditory skills are still developing at age five.
Several studies conducted in mammals and humans have shown that multisensory processing may be impaired following congenital sensory loss, particularly when no sensory experience is acquired within specific early developmental time windows known as sensitive periods. In this study we investigated whether basic multisensory abilities are impaired in hearing-restored individuals whose deafness was acquired at different stages of development. To this aim, we tested congenitally and late deaf cochlear implant (CI) recipients, age-matched with two groups of hearing controls, on an audio-tactile redundancy paradigm in which reaction times to unimodal and crossmodal redundant signals were measured. Our results showed that both congenitally and late deaf CI recipients were able to integrate audio-tactile stimuli, suggesting that neither congenital nor acquired deafness prevents the development and recovery of basic multisensory processing. However, we found that congenitally deaf CI recipients had a lower multisensory gain than their matched controls, which may be explained by their faster responses to tactile stimuli. We discuss this finding in the context of the reorganisation of the sensory systems following sensory loss and the possibility that these changes cannot be "rewired" through auditory reafferentation.
The sense of body ownership and body representation are fundamental parts of human consciousness, but the contribution of the visual modality to their development remains unclear. We tested congenitally and late blind adults on a somatosensory version of the rubber hand illusion and on the Aristotle illusion, in which sighted controls touching a single sphere with crossed fingers commonly report perceiving two spheres. We found that congenitally and late blind individuals did not report subjectively experiencing the rubber hand illusion. However, on an objective measure, the congenitally blind did not show a recalibration of the perceived position of their hand towards the rubber hand, while late blind and sighted individuals did. By contrast, all groups experienced the Aristotle illusion. This pattern of results provides evidence for a dissociation between the concepts of body ownership and spatial recalibration and, furthermore, suggests different reference frames for hands (external space) and fingers (anatomical space).