Compared with rigid objects, grasping and lifting compliant objects presents additional uncertainties. For any static grasp, the fingertip forces depend on factors such as the locations of the contact points, and the contact forces must be coordinated to maintain equilibrium. For compliant objects, the locations and orientations of the contact surfaces change in a force-dependent manner, thereby changing the force requirements; every force adjustment then produces further changes in object shape. This study characterized force and muscle activation patterns under these conditions. Fingertip forces were measured as subjects grasped and lifted a 200-g object with the thumb, index, and ring fingers. A spring was sometimes placed under the index and/or ring finger contact surface. Surface electromyographic activity was recorded from ten hand muscles and one proximal arm muscle. The patterns of grip (normal) force and muscle activity were similar across conditions during the load and lift phases, but their amplitudes depended on whether the contact surface was compliant. Specifically, grip force increased smoothly during the load phase under all conditions, whereas the tangential contact (load) force did not increase monotonically when one or more of the contact surfaces were compliant, resulting in a decoupling of the grip and load forces.
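The equilibrium requirement described above can be made concrete with a minimal sketch. The numbers below are hypothetical (they are not the study's measurements): a 200-g object held in a three-finger grasp, with the thumb's normal (grip) force opposing those of the index and ring fingers and the tangential (load) forces at the three contacts summing to the object's weight, so that both net force and net moment vanish.

```python
import numpy as np

# Hypothetical planar example: a 200-g object held by the thumb on one side
# and the index and ring fingers on the other. Static equilibrium requires
# the contact forces to cancel the object's weight and net moment.
g = 9.81
weight = np.array([0.0, -0.200 * g, 0.0])  # N; gravity along -y

# Contact points (m) and contact forces (N); x is the grip axis, y vertical.
# Normal forces act along x, tangential (load) forces along y.
contacts = {
    "thumb": (np.array([-0.02,  0.00, 0.0]), np.array([ 4.0, 0.981,  0.0])),
    "index": (np.array([ 0.02,  0.01, 0.0]), np.array([-2.0, 0.4905, 0.0])),
    "ring":  (np.array([ 0.02, -0.01, 0.0]), np.array([-2.0, 0.4905, 0.0])),
}

net_force = weight.copy()
net_moment = np.zeros(3)
for point, force in contacts.values():
    net_force += force
    net_moment += np.cross(point, force)  # moment about the object's origin

print(np.allclose(net_force, 0.0))   # grip forces cancel; load forces carry the weight
print(np.allclose(net_moment, 0.0))  # contact moments balance
```

With a compliant contact, the contact point and surface orientation shift as force is applied, so such a balance must be re-established continuously rather than computed once.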
Rotation of an object held with three fingers is produced by modulating the amplitude and direction of force at one or more contact points. Changes in the moment arm through which these forces act can also contribute to the rotational moment. Force amplitude and direction, as well as the center of pressure on each contact surface, must therefore be carefully coordinated to produce a rotation. Because no single solution exists, this study sought to identify consistent strategies for simple position-to-position rotations about the pitch, roll, and yaw axes. Force amplitude and direction, together with the center of pressure on each contact surface (and thus the moment arm), were measured as human subjects rotated a 420-g object instrumented with force transducers, grasped with the thumb, index, and ring fingers (average movement time: 500 ms). Electromyographic (EMG) activity was recorded from five intrinsic and three extrinsic hand muscles and two wrist muscles. Principal components analysis of the force and EMG records revealed just two main temporal patterns: the first followed rotational position, and the second had a time course resembling rotational velocity. Although the task could have been accomplished by dynamic modulation of wrist-muscle activity alone, these two dynamic EMG patterns appeared in intrinsic hand muscles as well. In contrast to previous reports of phasic activity bursts shifting in time across muscles, all EMG records in this task were well described by just these two temporal patterns, resembling the position and velocity traces.
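The principal-components decomposition named above can be illustrated with synthetic data (this is a sketch, not the paper's analysis or recordings): if multiple EMG records are mixtures of a position-like waveform and a velocity-like waveform, PCA on the record matrix recovers two dominant temporal components.

```python
import numpy as np

# Illustrative sketch with synthetic data: build "EMG" traces as random
# mixtures of a position-like waveform (smooth sigmoid) and a velocity-like
# waveform (bell-shaped), then count dominant temporal patterns with PCA.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 0.5, 100)                        # 500-ms movement
position = 1.0 / (1.0 + np.exp(-(t - 0.25) / 0.04))   # position-like trace
velocity = np.gradient(position, t)
velocity /= np.abs(velocity).max()                    # velocity-like trace

# Ten "muscles", each a random mix of the two patterns plus a little noise.
weights = rng.uniform(-1.0, 1.0, size=(10, 2))
emg = weights @ np.vstack([position, velocity])
emg += 0.02 * rng.standard_normal(emg.shape)

# PCA via SVD on the mean-centered records.
centered = emg - emg.mean(axis=1, keepdims=True)
_, s, _ = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / (s**2).sum()
print(explained[:2].sum())  # two components account for nearly all the variance
```

The finding that two temporal patterns suffice means the weight matrix, not the waveforms themselves, differs across muscles and rotation axes.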
Multiple sensory modalities gather information about our surroundings so that movements can be planned according to the properties of the environment and the objects within it. This study examined the sensitivity of visual and haptic information, alone and together, for detecting curvature. When both visual and haptic information were present, temporal delays in signal onset were used to determine the effect of asynchronous sensory information on the interference of vision with the haptic estimate of curvature. Even at the largest temporal delays, where the visual and haptic signals were clearly disparate, visual information influenced the haptic perception of curvature. The uncertainty in the unimodal visual condition was smaller than that in the unimodal haptic condition, regardless of whether the haptic information was acquired actively or with robot assistance. When both visual and haptic information were available, the uncertainty was not reduced; it equaled that of the unimodal haptic condition. The weighting of visual and haptic information varied widely across subjects: some made judgments based largely on haptic information, while others weighted visual information equally with, or more heavily than, the haptic information.
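For context, the standard maximum-likelihood model of cue combination weights each cue by its reliability (inverse variance) and predicts that the combined estimate is less variable than either cue alone, which is precisely what this abstract reports did not occur. A minimal sketch of that standard model, with hypothetical numbers chosen so that vision is the less uncertain cue as reported:

```python
def ml_combine(mu_v, var_v, mu_h, var_h):
    """Reliability-weighted (maximum-likelihood) fusion of a visual and a
    haptic estimate; weights are proportional to inverse variance."""
    w_v = (1.0 / var_v) / (1.0 / var_v + 1.0 / var_h)  # weight on vision
    mu = w_v * mu_v + (1.0 - w_v) * mu_h               # combined estimate
    var = 1.0 / (1.0 / var_v + 1.0 / var_h)            # predicted variance
    return mu, var

# Hypothetical values: vision less uncertain than haptics.
mu, var = ml_combine(mu_v=1.0, var_v=0.5, mu_h=1.2, var_h=1.0)
print(var < 0.5)  # optimal fusion predicts variance below the better cue
```

The observed result, combined uncertainty equal to the haptic level despite visual capture of the estimate, is a deviation from this optimal-fusion prediction.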