Gait analysis for patients with lower limb motor dysfunction is a useful tool that assists clinicians in diagnosis, assessment, and rehabilitation planning. Implementing accurate automatic gait analysis for hemiparetic patients after stroke remains a great challenge in clinical practice. This study develops a new automatic gait analysis system for qualitatively recognizing and quantitatively assessing the gait abnormality of post-stroke hemiparetic patients. Twenty-one post-stroke patients and twenty-one healthy volunteers participated in walking trials. Three of the most representative types of gait data, i.e., marker trajectory (MT), ground reaction force (GRF), and electromyogram (EMG), were simultaneously acquired from these subjects while they walked. A multimodal fusion architecture is established on these data to qualitatively distinguish hemiparetic gait from normal gait with different pattern recognition techniques and to quantitatively estimate each patient's lower limb motor function with a novel probability-based gait score. Seven decision fusion algorithms were tested in this architecture, and extensive data analysis experiments were conducted. The results indicate that both the recognition and the estimation performance of the system improve as more modalities are fused. For recognition, the random forest classifier based on the GRF data alone achieves an accuracy of 92.26%, outperforming the other single-modal schemes. With two modalities, the accuracy rises to 95.83% when the support vector machine (SVM) fusion algorithm combines the MT and GRF data; integrating all three modalities with the SVM fusion algorithm further improves the accuracy to 98.21%. For estimation, the absolute values of the correlation coefficients between the estimates of these three schemes and the Wisconsin gait scale scores of the post-stroke patients are 0.63, 0.75, and 0.84, respectively, showing that clinical relevance strengthens as more modalities are used. These promising results demonstrate that the proposed method has considerable potential to inform the future design of automatic gait analysis systems for clinical practice.
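The abstract does not detail the fusion pipeline, but the decision fusion it describes, with per-modality classifiers whose probability outputs are combined by an SVM and a fused probability of normal gait serving as a gait score, can be sketched as follows. All feature matrices, dimensions, and hyperparameters below are hypothetical placeholders, not the paper's actual implementation.

```python
# Minimal sketch of SVM-based decision fusion over MT, GRF, and EMG
# features, assuming one feature vector per walking trial. The synthetic
# data, feature dimensions, and hyperparameters are illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n = 42                                  # 21 patients + 21 controls
y = np.repeat([1, 0], n // 2)           # 1 = hemiparetic, 0 = normal
# Hypothetical per-modality feature matrices (MT, GRF, EMG).
modalities = [rng.normal(size=(n, 10)) + s * y[:, None]
              for s in (0.5, 0.8, 0.4)]

# Stage 1: one classifier per modality; out-of-fold probabilities keep
# the fusion stage from seeing its own training labels.
base_probas = []
for X in modalities:
    rf = RandomForestClassifier(n_estimators=200, random_state=0)
    proba = cross_val_predict(rf, X, y, cv=5, method="predict_proba")
    base_probas.append(proba[:, 1])     # P(hemiparetic | modality)

# Stage 2: an SVM fuses the stacked per-modality probabilities.
Z = np.column_stack(base_probas)
fuser = SVC(kernel="rbf", probability=True).fit(Z, y)

# A probability-based gait score: P(normal gait) under the fused model.
# In the study, such a score was correlated with Wisconsin gait scale
# scores to gauge clinical relevance.
gait_score = fuser.predict_proba(Z)[:, 0]
print("mean gait score, patients vs controls:",
      gait_score[y == 1].mean(), gait_score[y == 0].mean())
```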
In this study, a multimodal fusion framework based on three different biosignal modalities is developed to recognize human intentions related to the lower limb multi-joint motions that commonly appear in daily life. Electroencephalogram (EEG), electromyogram (EMG), and mechanomyogram (MMG) signals were simultaneously recorded from twelve subjects while they performed nine lower limb multi-joint motions. These multimodal data serve as inputs to the fusion framework for identifying the different motion intentions. Twelve fusion techniques were evaluated in this framework, and a large number of comparative experiments were carried out. The results show that a support vector machine-based three-modal fusion scheme achieves average accuracies of 98.61%, 97.78%, and 96.85% under three different data partitioning schemes. Furthermore, statistical tests reveal that this fusion scheme yields a significant accuracy improvement over two-modal fusion and single-modality schemes. These promising results indicate the potential of the multimodal fusion framework to facilitate the future development of human-robot interaction for lower limb rehabilitation.
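For this second framework, the abstract likewise omits implementation details. The sketch below, with hypothetical synthetic features standing in for EEG, EMG, and MMG, illustrates one common way to compare three-modal fusion against a single modality and to test the improvement for significance (a Wilcoxon signed-rank test over matched folds here; the paper's own fusion scheme and statistical test may differ).

```python
# Hedged sketch: nine-class intention recognition with an SVM, comparing
# a single modality against three-modal (feature-concatenation) fusion.
# All data and parameters are synthetic placeholders.
import numpy as np
from scipy.stats import wilcoxon
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_classes, n_per_class = 9, 20          # nine lower limb motions
y = np.repeat(np.arange(n_classes), n_per_class)

def fake_modality(strength):
    """Synthetic stand-in for one biosignal's feature matrix."""
    return rng.normal(size=(y.size, 8)) + strength * y[:, None]

X_eeg, X_emg, X_mmg = (fake_modality(s) for s in (0.2, 0.4, 0.3))
X_fused = np.hstack([X_eeg, X_emg, X_mmg])

def fold_accuracies(X):
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    return cross_val_score(clf, X, y, cv=10, scoring="accuracy")

acc_single = fold_accuracies(X_emg)
acc_fused = fold_accuracies(X_fused)

# Paired non-parametric test over the ten matched folds.
stat, p = wilcoxon(acc_fused, acc_single)
print(f"EMG only: {acc_single.mean():.3f}  "
      f"fused: {acc_fused.mean():.3f}  p = {p:.4f}")
```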