Sounds offer a rich source of information about events taking place in our physical and social environment. However, outside the domains of speech and music, little is known about whether humans can recognize and act upon the intentions of another agent's actions detected through auditory information alone. In this study we assessed whether intention can be inferred from the sound an action makes and, in turn, whether this information can be used to prospectively guide movement. In two experiments, experienced and novice basketball players had to virtually intercept an attacker by listening to audio recordings of that player's movements. In the first experiment participants moved a slider, while in the second they moved their body, to block the perceived passage of the attacker as they would in a real basketball game. Combinations of deceptive and nondeceptive movements were used to test whether novice and/or experienced listeners could perceive the attacker's intentions through sound alone. Basketball players predicted the final running direction more accurately than nonplayers, particularly in the second experiment, in which the interceptive action was more basketball specific. We suggest that athletes exhibit better action anticipation because they can pick up and use the relevant kinematic features of deceptive movement from event-related sounds alone. This result suggests that action intention can be perceived through the sound a movement makes, and that the ability to determine another person's action intention from the information conveyed through sound is honed through practice.
Grasping movements are typically performed toward visually sensed objects. However, planning and execution of grasping movements can also be supported by haptic information when we grasp objects held in the other hand. In the present study we investigated this sensorimotor integration process by comparing grasping movements toward objects sensed through visual, haptic, or visuo-haptic signals. When movements were based on haptic information only, hand preshaping was initiated earlier, the digits closed on the object more slowly, and the final phase was more cautious compared with movements based on visual information only. Importantly, the simultaneous availability of vision and haptics led to faster movements and to an overall decrease in grip aperture. Our findings also show that each modality contributes to a different extent in different phases of the movement, with haptics being more crucial in the initial phases and vision being more important for the final online control. Thus, vision and haptics can be flexibly combined to optimize the execution of grasping movements.
The aim of this study was to reveal the role of sound in action anticipation and performance, and to test whether the level of precision in action planning and execution is related to the sensorimotor skills and experience that listeners possess about a specific action. Individuals ranging from 18 to 75 years of age, some with no skateboarding experience and others experts in the sport, were compared in their ability to anticipate and simulate a skateboarding jump by listening to the sound it produces. Only skaters were able to modulate the forces underfoot and to apply muscle synergies closely resembling those a skater would use when actually jumping on a skateboard. More importantly, we showed that only skaters were able to plan the action by activating anticipatory postural adjustments about 200 ms after the jump event. We conclude that expert motor patterns are guided by auditory events that trigger proper anticipation of the corresponding movement patterns.
Haptics provides information about the size and position of a handheld object. However, it is still unknown how haptics contributes to action correction when a sudden perturbation changes the configuration of the handheld object. In this study, we occasionally perturbed the size of an object that was the target of a right-hand reach-to-grasp movement. In some cases, participants held the target object with their left hand, which provided haptic information about the object perturbation. We compared the corrective responses to perturbations in three sensory conditions: visual (participants had full vision of the object, but haptic information from the left hand was prevented), haptic (object size was sensed by the left hand and vision was prevented), and visuo-haptic (both visual and haptic information were available throughout the movement). We found that haptic inputs evoked faster contralateral corrections than visual inputs, although actions in the haptic and visual conditions were similar in movement duration. Strikingly, the corrective responses in the visuo-haptic condition were as fast as those in the haptic condition, a result contrary to that predicted by simple summation of unisensory signals. These results suggest the existence of a haptomotor reflex that can trigger automatic and efficient grasping corrections of the contralateral hand that are faster than those initiated by the well-known visuomotor reflex and the tactile-motor reflex. NEW & NOTEWORTHY We show that online grip aperture corrections during grasping actions are contingent on the sensory modality used to detect the object perturbation. We found that sensing perturbations with the contralateral hand only (haptics) leads to faster action corrections than when object perturbations are only visually sensed. Moreover, corrections following visuo-haptic perturbations were as fast as those following haptic perturbations. Thus, a haptomotor reflex triggers faster automatic responses than the visuomotor reflex.