Functional movement disorders require attention to manifest, yet patients report that the abnormal movements are outside their control. In this study we explored the phenomenon of sensory attenuation, a measure of the sense of agency over movement, in this group of patients using a force-matching task. Fourteen patients and 14 healthy control subjects were presented with forces ranging from 1 to 3 N applied to the index finger of their left hand. Participants were required to match these forces, either by pressing directly on their own finger or by operating a robot that pressed on the finger for them. As expected, healthy control subjects consistently overestimated the required force when pressing directly on their own finger compared with when they operated the robot. Patients, however, did not, indicating a significant loss of sensory attenuation in this group. These data are important because they demonstrate that a fundamental component of normal voluntary movement is impaired in patients with functional movement disorders. Loss of sensory attenuation has been correlated with loss of the sense of agency, and may help to explain why patients report that they do not experience the abnormal movements as voluntary.
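The attenuation effect described above can be quantified as the difference in matching error between the two conditions. The sketch below is purely illustrative (it is not the study's analysis code, and all force values in it are hypothetical), showing how such an attenuation index might be computed:

```python
# Illustrative sketch of a sensory-attenuation index from a force-matching task.
# Attenuation appears as overestimation of the matched force in the direct-press
# condition relative to the robot-mediated condition. All numbers are hypothetical.

def overestimation(target_forces, matched_forces):
    """Mean signed matching error (matched - target), in newtons."""
    errors = [m - t for t, m in zip(target_forces, matched_forces)]
    return sum(errors) / len(errors)

targets = [1.0, 1.5, 2.0, 2.5, 3.0]   # target forces, N (study range was 1-3 N)
direct  = [1.8, 2.4, 3.1, 3.7, 4.3]   # hypothetical direct-press matches
robot   = [1.1, 1.5, 2.1, 2.6, 3.1]   # hypothetical robot-mediated matches

# A positive index indicates sensory attenuation (extra force when self-pressing).
attenuation_index = overestimation(targets, direct) - overestimation(targets, robot)
print(round(attenuation_index, 2))
```

Under this reading, the patients' result corresponds to an attenuation index near zero, while healthy controls show a clearly positive index.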
The recent discovery of melanopsin-containing retinal ganglion cells (mRGCs) has prompted a fundamental reassessment of non-image-forming processing, such as circadian photoentrainment and the pupillary light reflex. In the conventional view of retinal physiology, rods and cones were assumed to be the only photoreceptors in the eye and were therefore considered responsible for non-image-forming processing. It is now known that signals from mRGCs contribute to this processing alongside cone-mediated luminance signals, but it remains unclear how the two signals are summed. We designed and built a novel multi-primary stimulation system that stimulates mRGCs independently of the other photoreceptors, using a silent-substitution technique on a bright steady background. The system allows direct measurement of pupillary responses driven by mRGCs and by cones. We observed a significant change in steady-state pupil diameter when we varied mRGC excitation alone, with no change in luminance or colour. Furthermore, the change in pupil diameter induced by mRGCs was larger than that induced by a variation in luminance alone: on a bright steady background, the mRGC signals contribute to the pupillary pathway roughly three times more strongly than the L- and M-cone signals.
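At its core, silent substitution is a linear-algebra problem: find primary modulations that change one photoreceptor class's excitation while holding the others constant. The sketch below illustrates this idea only; it is not the authors' stimulation code, and the sensitivity matrix entries are invented for the example:

```python
# Minimal sketch of the silent-substitution computation (illustrative only).
# Assume a 5-primary stimulator. Each row of A gives one receptor class's
# response to unit modulation of each primary; these numbers are hypothetical.
import numpy as np

A = np.array([
    [1.0, 0.5, 0.1, 0.2, 0.1],   # L cones
    [0.4, 1.0, 0.2, 0.2, 0.1],   # M cones
    [0.1, 0.2, 1.0, 0.3, 0.2],   # S cones
    [0.2, 0.2, 0.2, 1.0, 0.3],   # rods
    [0.1, 0.2, 0.3, 0.3, 1.0],   # melanopsin (mRGCs)
])

# Desired receptor modulation: drive melanopsin alone, keeping the
# cones and rods "silent" (zero change in their excitation).
target = np.array([0.0, 0.0, 0.0, 0.0, 1.0])

# Solve A @ dp = target for the primary modulations dp.
dp = np.linalg.solve(A, target)

print(np.allclose(A @ dp, target))   # only melanopsin excitation changes
```

With five primaries and five receptor classes the system is exactly determined; real stimulators must additionally keep each primary's modulation within its physical gamut.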
Lifting an object requires precise scaling of fingertip forces based on a prediction of object weight. At object contact, a series of tactile and visual events arises that must be rapidly processed online to fine-tune the planned motor commands for lifting the object. The brain mechanisms by which multisensory signals are integrated serially at these transient sensorimotor events, a general feature of actions involving hand-object interactions, are not yet understood. In this study we tested the relative weighting of haptic and visual signals when they are integrated online into the motor command. We used a new virtual reality setup to desynchronize visual feedback from haptics, which allowed us to probe the relative contributions of haptics and vision in driving participants' movements as they grasped virtual objects simulated by two force-feedback robots. We found that visual delay changed the profile of fingertip force generation and led participants to perceive objects as heavier than when lifts were performed without visual delay. We further modeled the effect of vision on motor output by manipulating the extent to which delayed visual events could bias the force profile, which allowed us to determine the specific weighting the brain assigns to haptics and vision. Our results show for the first time how visuo-haptic integration is processed at discrete sensorimotor events for controlling object-lifting dynamics, and further highlight how multisensory signals are organized online for the control of action and perception. NEW & NOTEWORTHY Dexterous hand movements require rapid integration of information from different senses, in particular touch and vision, at key time points as movement unfolds. The relative weighting between vision and haptics for object manipulation is unknown. We used object lifting in virtual reality to desynchronize visual and haptic feedback and determine their relative weightings. Our findings shed light on how rapid multisensory integration is processed over a series of discrete sensorimotor control points.
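A common way to formalize the relative weighting of two senses is reliability-weighted (minimum-variance) cue combination, in which each cue's weight is proportional to its inverse variance. The sketch below illustrates that standard model, not the authors' specific force-profile model, and the variances and estimates in it are hypothetical:

```python
# Hedged sketch of reliability-weighted cue combination (a standard model of
# multisensory integration; not the study's specific model). Each cue is
# weighted by its inverse variance; all numbers below are hypothetical.

def combine(est_haptic, var_haptic, est_visual, var_visual):
    """Minimum-variance linear combination of a haptic and a visual estimate."""
    w_h = (1.0 / var_haptic) / (1.0 / var_haptic + 1.0 / var_visual)
    w_v = 1.0 - w_h
    return w_h * est_haptic + w_v * est_visual, w_v

# Hypothetical weight estimates (N) from touch and from delayed vision:
# the less reliable (higher-variance) visual cue receives the smaller weight,
# but still biases the combined estimate upward, as in the heaviness illusion.
combined, w_v = combine(est_haptic=2.0, var_haptic=0.04,
                        est_visual=2.6, var_visual=0.16)
print(round(combined, 2), round(w_v, 2))
```

In this framing, the delayed visual events reported in the study bias the combined estimate toward the visual signal in proportion to the weight the brain assigns to vision at each sensorimotor event.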