Humans are well able to operate tools whereby their hand movement is linked, via a kinematic transformation, to a spatially distant object moving in a separate plane of motion. An everyday example is controlling a cursor on a computer monitor. Despite these separate reference frames, the perceived positions of the hand and the object have been found to be biased toward each other. We propose that this perceptual attraction is based on the principles by which the brain integrates redundant sensory information of single objects or events, known as optimal multisensory integration. That is, (i) the sensory estimates of the hand and the tool are weighted according to their relative reliabilities (i.e., inverse variances), and (ii) the unisensory reliabilities sum up in the integrated estimate. We assessed whether perceptual attraction is consistent with optimal multisensory integration model predictions. We used a cursor-control tool-use task in which we manipulated the relative reliability of the unisensory hand and cursor position estimates. The perceptual biases shifted according to these relative reliabilities, with an additional bias due to contextual factors that were present in one experiment but not in the other. The biased position judgments' variances were, however, systematically larger than the predicted optimal variances. Our findings suggest that the perceptual attraction in tool use results from a reliability-based weighting mechanism similar to optimal multisensory integration, but that certain boundary conditions for optimality might not be satisfied.

Kinematic tool use is associated with a perceptual attraction between the spatially separated hand and the effective part of the tool. We provide a formal account for this phenomenon, thereby showing that the process behind it is similar to optimal integration of sensory information relating to single objects.
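The two principles of optimal multisensory integration mentioned above, inverse-variance weighting and summed reliabilities, can be sketched in a few lines. This is a minimal illustration; the position values and variances are hypothetical, chosen only to show the direction of the predicted bias.

```python
def integrate(x_hand, var_hand, x_cursor, var_cursor):
    """Reliability-weighted (inverse-variance) integration of two position
    estimates, as in optimal multisensory integration."""
    r_hand, r_cursor = 1.0 / var_hand, 1.0 / var_cursor  # reliabilities
    w_hand = r_hand / (r_hand + r_cursor)                # relative weight of the hand
    x_hat = w_hand * x_hand + (1.0 - w_hand) * x_cursor  # integrated position estimate
    var_hat = 1.0 / (r_hand + r_cursor)                  # reliabilities sum up
    return x_hat, var_hat

# A reliable cursor estimate pulls the integrated estimate toward the cursor,
# and the integrated variance falls below both unisensory variances.
x_hat, var_hat = integrate(x_hand=0.0, var_hand=4.0, x_cursor=10.0, var_cursor=1.0)
# x_hat = 8.0, var_hat = 0.8
```

The same weighting predicts the direction of the perceptual attraction: the less reliable unisensory estimate is drawn more strongly toward the more reliable one.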
In a basic cursor-control task, the perceived positions of the hand and the cursor are biased towards each other. We recently found that this phenomenon conforms to the reliability-based weighting mechanism of optimal multisensory integration. This indicates that optimal integration is not restricted to sensory signals originating from a single source, as is the prevailing view, but that it also applies to separate objects that are connected by a kinematic relation (i.e. hand and cursor). In the current study, we examined which aspects of the kinematic relation are crucial for eliciting the sensory integration: (i) the cross-correlation between kinematic variables of the hand and cursor trajectories, and/or (ii) an internal model of the hand-cursor kinematic transformation. Participants made out-and-back movements from the centre of a semicircular workspace to its boundary, after which they judged the position where either their hand or the cursor hit the boundary. We analysed the position biases and found that the integration was strong in a condition with high kinematic correlations (a straight hand trajectory was mapped to a straight cursor trajectory), that it was significantly reduced for reduced kinematic correlations (a straight hand trajectory was transformed into a curved cursor trajectory) and that it was not affected by the inability to acquire an internal model of the kinematic transformation (i.e. by the trial-to-trial variability of the cursor curvature). These findings support the idea that correlations play a crucial role in multisensory integration irrespective of the number of sensory sources involved.
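The role of the kinematic cross-correlations can be illustrated with synthetic trajectories. This is a hypothetical sketch: the noise levels and the sinusoidal curvature profile are illustrative assumptions, not the mapping used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 100)

# Hypothetical straight outward hand movement: deviations perpendicular to the
# movement direction (y) are just small motor/sensory noise.
hand_y = 0.02 * rng.standard_normal(t.size)

def corr(a, b):
    """Pearson correlation between two kinematic variables."""
    return np.corrcoef(a, b)[0, 1]

# (i) Veridical mapping: the straight hand path maps to a straight cursor path.
cursor_y_straight = hand_y.copy()
# (ii) Curved mapping: a perpendicular deviation peaking mid-movement
# decorrelates hand and cursor positions along the curvature direction.
cursor_y_curved = hand_y + 0.5 * np.sin(np.pi * t)

print(corr(hand_y, cursor_y_straight))  # close to 1: fully correlated
print(corr(hand_y, cursor_y_curved))    # well below 1: correlation reduced
```

Under this reading, the curved mapping weakens the online evidence that hand and cursor belong together, which is consistent with the reduced integration reported for curved cursor trajectories.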
The brain needs to identify redundant sensory signals in order to integrate them optimally. The identification process, referred to as causal inference, depends on the spatial and temporal correspondence of the incoming sensory signals (‘online sensory causality evidence’) as well as on prior expectations regarding their causal relation. We here examine whether the same causal inference process underlies spatial integration of actions and their visual consequences. We used a basic cursor-control task for which online sensory causality evidence is provided by the correlated hand and cursor movements, and prior expectations are formed by everyday experience of such correlated movements. Participants made out-and-back movements and subsequently judged the hand or cursor movement endpoints. In one condition, we omitted the online sensory causality evidence by showing the cursor only at the movement endpoint. The integration strength was lower than in conditions where the cursor was visible during the outward movement, but a substantial level of integration persisted. These findings support the hypothesis that the binding of actions and their visual consequences is based on the general mechanism of optimal integration, and they specifically show that such binding can occur even if it is previous experience only that identifies the action consequence.
Spatial proximity enhances the sensory integration of exafferent position information, likely because it indicates whether the information comes from a single physical source. Does spatial proximity also affect the integration of position information regarding an action (here a hand movement) with that of its visual effect (here a cursor motion), that is, when the sensory information comes from physically distinct objects? In this study, participants made out-and-back hand movements whereby the outward movements were accompanied by corresponding cursor motions on a monitor. Judgments of the hand or cursor movement endpoints made after such movements are typically biased toward each other, consistent with an underlying optimal integration mechanism. To study the effect of spatial proximity, we presented the hand and cursor either in orthogonal planes (horizontal and frontal, respectively) or we aligned them in the horizontal plane. We did not find the expected enhanced integration strength in the latter spatial condition. As a secondary question we asked whether spatial transformations required for the position judgments (i.e., horizontal to frontal or vice versa) could be the origin of previously observed suboptimal variances of the integrated hand and cursor position judgments. We found, however, that the suboptimality persisted when spatial transformations were omitted (i.e., with the hand and cursor in the same plane). Our findings thus clearly show that the integration of actions with their visual effects is, at least for cursor control, independent of spatial proximity.