The integration of information from different sensory modalities is crucial for successful navigation through an environment. Among other signals, self-motion induces distinct optic flow patterns on the retina, vestibular signals, and tactile flow, which contribute to determining traveled distance (path integration) or movement direction (heading). While the processing of combined visual–vestibular information has been the subject of a growing body of literature, the processing of visuo-tactile signals in the context of self-motion has received comparatively little attention. Here, we investigated whether visual heading perception is influenced by behaviorally irrelevant tactile flow. In the visual modality, we simulated an observer’s self-motion across a horizontal ground plane (optic flow). Tactile self-motion stimuli were delivered by air flow from head-mounted nozzles (tactile flow). In blocks of trials, we presented only visual or only tactile stimuli, and subjects had to report their perceived heading. In another block of trials, tactile and visual stimuli were presented simultaneously, with the tactile flow within ±40° of the visual heading (bimodal condition). Here, importantly, participants had to report their perceived visual heading. Perceived self-motion direction in all conditions revealed a centripetal bias, i.e., heading directions were perceived as compressed toward straight ahead. In the bimodal condition, we found a small but systematic influence of task-irrelevant tactile flow on visually perceived headings as a function of their directional offset. We conclude that tactile flow is more tightly linked to self-motion perception than previously thought.
Self-motion through an environment induces various sensory signals, e.g., visual, vestibular, auditory, or tactile. Numerous studies have investigated the role of visual and vestibular stimulation in the perception of self-motion direction (heading). Here, we investigated the rarely considered interaction of visual and tactile stimuli in heading perception. Participants were presented with optic flow simulating forward self-motion across a horizontal ground plane (visual), airflow toward the participants' forehead (tactile), or both. In separate blocks of trials, participants indicated perceived heading from unimodal visual, unimodal tactile, or bimodal sensory signals. In bimodal trials, presented headings were either spatially congruent or incongruent, with a maximum offset between visual and tactile heading of 30°. To investigate the reference frame in which visuo-tactile heading is encoded, we varied head and eye orientation during stimulus presentation. Visual and tactile stimuli were designed to achieve comparable heading accuracies between modalities. Nevertheless, in bimodal trials, heading perception was dominated by the visual stimulus. A change of head orientation had no significant effect on perceived heading, while, surprisingly, a change in eye orientation affected tactile heading perception. Overall, we conclude that tactile flow is more important to heading perception than previously thought.