Background
Proper modeling of human grasping and hand movements is fundamental to robotics, prosthetics, physiology, and rehabilitation. The taxonomies of hand grasps proposed in the scientific literature so far are based on qualitative analyses of the movements and are thus usually not quantitatively justified.
Methods
This paper presents, to the best of our knowledge, the first quantitative taxonomy of hand grasps based on biomedical measurements. The taxonomy is based on electromyography and kinematic data recorded from 40 healthy subjects performing 20 unique hand grasps. For each subject, a set of hierarchical trees is computed for several signal features. Afterwards, the trees are combined, first into modality-specific (i.e. muscular and kinematic) taxonomies of hand grasps and then into a general quantitative taxonomy of hand movements. The two modality-specific taxonomies provide similar results despite describing different parameters of hand movements, one muscular and the other kinematic.
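The paper does not include code, but the following minimal sketch illustrates how a per-subject hierarchical tree of grasps could be built with standard tooling, assuming each grasp is summarized by a feature vector (e.g. mean rectified EMG per electrode or mean joint angles). The feature layout, the correlation distance, and the average-linkage choice are illustrative assumptions, not the authors' exact pipeline.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, dendrogram
from scipy.spatial.distance import pdist

# Illustrative assumption: one summary feature vector per grasp for one
# subject, e.g. mean rectified EMG per electrode or mean joint angle.
rng = np.random.default_rng(0)
n_grasps, n_features = 20, 12            # 20 grasps, 12-channel summary
grasp_features = rng.normal(size=(n_grasps, n_features))

# Pairwise distances between grasps; the metric is an assumption.
distances = pdist(grasp_features, metric="correlation")

# Agglomerative clustering yields this subject's hierarchical tree.
tree = linkage(distances, method="average")

# The resulting dendrogram is one subject-level taxonomy of the 20 grasps.
dendrogram(tree, labels=[f"grasp_{i}" for i in range(n_grasps)], no_plot=True)
```

Combining trees across the 40 subjects could, for instance, be done by averaging the per-subject distance matrices before a final clustering step; the paper's actual combination procedure may differ.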
Results
The general taxonomy merges the kinematic and muscular descriptions into a comprehensive hierarchical structure. The results clarify what has been proposed in the literature so far and partially confirm the qualitative parameters used to create previous taxonomies of hand grasps. According to the results, hand movements can be divided into five categories defined by overall grasp shape, finger positioning, and muscular activation. Part of the results is qualitatively in accordance with previous findings on kinematic hand-grasping synergies.
Conclusions
The taxonomy of hand grasps proposed in this paper grounds in quantitative measurements what has so far been proposed in the field on a qualitative basis, and may therefore have an impact on several scientific fields.
Interest in wearable prosthetic devices has boosted the search for a robust framework that helps injured subjects regain lost functionality. A great number of solutions exploit physiological human signals, such as electromyography (EMG), to control the prosthesis naturally, reproducing what happens in the human limb. In this paper, we propose for the first time integrating EMG signals with Inertial Measurement Unit (IMU) information to improve subject-independent models for controlling robotic hands. EMG data are very sensitive to both physical and physiological variations, particularly between different subjects. The introduction of IMUs aims at enriching the subject-independent model, making it more robust with information that does not depend strictly on the physiological characteristics of the subject. We compare three different models: the first based on EMG alone, the second merging data from EMG and the 2 best IMUs available, and the third using EMG and IMU information corresponding to the same 3 electrodes. The three techniques are tested on two different movements executed by 35 healthy subjects, using a leave-one-out approach. The framework is able to estimate the bending angles of the joints involved in the motion online, achieving an accuracy of up to 0.8634. The resulting joint angles are used to actuate a robotic hand in a simulated environment.
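As a rough illustration of the subject-independent pipeline described above, the sketch below trains a regressor on combined EMG and IMU feature windows from all subjects but one and predicts joint bending angles for the held-out subject. The feature dimensions, the ridge regressor, and the R²-style score are assumptions made for illustration; the paper's actual model and its 0.8634 figure may be computed differently.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics import r2_score

# Illustrative assumption: per-window features are concatenated EMG and IMU
# summaries, and targets are the bending angles of the tracked joints.
rng = np.random.default_rng(1)
n_subjects, windows_per_subject = 35, 200
n_emg, n_imu, n_joints = 12, 6, 5

X = rng.normal(size=(n_subjects, windows_per_subject, n_emg + n_imu))
y = rng.normal(size=(n_subjects, windows_per_subject, n_joints))

scores = []
for held_out in range(n_subjects):
    # Leave-one-subject-out: train on 34 subjects, test on the remaining one.
    train = [s for s in range(n_subjects) if s != held_out]
    X_train = X[train].reshape(-1, n_emg + n_imu)
    y_train = y[train].reshape(-1, n_joints)

    model = Ridge(alpha=1.0).fit(X_train, y_train)
    y_pred = model.predict(X[held_out])
    scores.append(r2_score(y[held_out], y_pred))

print(f"mean leave-one-subject-out score: {np.mean(scores):.4f}")
```

Under this layout, comparing the EMG-only model against the EMG+IMU variants amounts to slicing different feature columns out of X before fitting.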