Highlights
- Cross-modal decoding of visual and auditory motion directions in hMT+/V5
- Motion-direction representation is, however, not abstracted from the sensory input
- We reveal a multifaceted representation of multisensory motion signals in hMT+/V5
In the present review, we focus on how commonalities in the ontogenetic development of the auditory and tactile sensory systems may inform the interplay between these signals in the temporal domain. In particular, we describe the results of behavioral studies that have investigated temporal resolution (in temporal order, synchrony/asynchrony, and simultaneity judgment tasks), as well as temporal numerosity perception, and similarities in the perception of frequency across touch and hearing. The evidence reviewed here highlights features of audiotactile temporal perception that are distinctive from those seen for other pairings of sensory modalities. For instance, audiotactile interactions are characterized in certain tasks (e.g., temporal numerosity judgments) by a more balanced reciprocal influence than are other modality pairings. Moreover, relative spatial position plays a different role in the temporal order and temporal recalibration processes for audiotactile stimulus pairings than for other modality pairings. The effect exerted by both the spatial arrangement of stimuli and attention on temporal order judgments is described. Moreover, a number of audiotactile interactions occurring during sensory-motor synchronization are highlighted. We also look at the audiotactile perception of rhythm and how it may be affected by musical training. The differences emerging from this body of research highlight the need for more extensive investigation into audiotactile temporal interactions. We conclude with a brief overview of some of the key issues deserving of further research in this area.

Keywords: Auditory, Tactile, Temporal, Frequency, Audiotactile, Crossmodal similarities, Mechanoreception, Multisensory

The boundaries between hearing and touch: the foundation of an analogy

We continuously interact with environments that provide a large amount of multisensory information to our various senses.
Researchers have now convincingly demonstrated that the inputs delivered by the different sensory channels tend to be bound together by the brain (see the section Research on hearing and touch: a multisensory perspective for a fuller discussion of this topic). Unlike the audiovisual and visuotactile sensory pairings, the interactions taking place at both the neuronal and the behavioral level between audition and touch have, to date, been explored in far less detail (see Kitagawa & Spence, 2006; Soto-Faraco & Deco, 2009, for reviews of the extant literature). The paucity of research covering this modality pairing is rather surprising when one considers the wide range of everyday situations in which we experience, even though often in subtle and unconscious ways, the interplay between these two senses. Examples include perceiving the "auditory" buzzing and the itchy "tactile" sensation of an insect landing on the back of our neck, or reaching for a mobile phone that is ringing and vibrating in our pocket. What is common to these situations is the exclusive, or at the very least predominant, reliance on cues provided by the
Participants made speeded discrimination responses to unimodal auditory stimuli (low-frequency vs. high-frequency sounds) or vibrotactile stimuli (presented to the index finger, upper location, or to the thumb, lower location). In the compatible blocks of trials, the implicitly related stimuli (i.e., high-frequency sounds and upper tactile stimuli, and low-frequency sounds and lower tactile stimuli) were associated with the same response key; in the incompatible blocks, the weakly related stimuli (i.e., high-frequency sounds and lower tactile stimuli, and low-frequency sounds and upper tactile stimuli) were associated with the same response key. Better performance was observed in the compatible (vs. incompatible) blocks, thus providing empirical support for the cross-modal association between the relative frequency of a sound and the relative elevation of a tactile stimulus.
It has been reported that people tend to preferentially associate phonemes like /m/, /l/, /n/ with curvilinear shapes and phonemes like /t/, /z/, /r/, /k/ with rectilinear shapes. Here we evaluated the performance of children/adolescents with autism spectrum disorders (ASD) and neurotypical controls on this audiovisual congruency phenomenon. Pairs of visual patterns (curvilinear vs. rectilinear) were presented to a group of ASD participants (low- or high-functioning) and a group of age-matched neurotypical controls. Participants were asked to associate each item with non-meaningful phoneme clusters. ASD participants showed a lower proportion of expected association responses than the controls. Within the ASD group, performance varied as a function of the severity of the symptomatology. These data suggest that children/adolescents with ASD show, although to different degrees depending on the severity of the ASD, lower phonetic-iconic congruency response patterns than neurotypical controls, pointing to poorer multisensory integration capabilities.