Objective. To quantify the concurrent accuracy and the test-retest reliability of a Kinect V2-based upper limb functional assessment system. Approach. Ten healthy males performed a series of upper limb movements, which were measured concurrently with Kinect V2 and the Vicon motion capture system (gold standard). Each participant attended two testing sessions, seven days apart. Four tasks were performed: hand to contralateral shoulder, hand to mouth, combing hair, and hand to back pocket. Upper limb kinematics were calculated using our developed kinematic model for Kinect V2 and the UWA model for Vicon. The interdevice coefficient of multiple correlation (CMC) and the root mean squared error (RMSE) were used to evaluate the validity of the kinematic waveforms. Mean absolute bias and Pearson's r were used to evaluate the validity of the angles at the point of target achieved (PTA) and the range of motion (ROM). The intersession CMC and RMSE and the intraclass correlation coefficient (ICC) were used to assess the test-retest reliability of Kinect V2. Main Results. Both validity and reliability were found to be task-dependent and plane-dependent. Kinect V2 had good accuracy in measuring shoulder and elbow flexion/extension angular waveforms (CMC>0.87), moderate accuracy in measuring shoulder adduction/abduction angular waveforms (CMC=0.69-0.82), and poor accuracy in measuring shoulder internal/external rotation angles (CMC<0.6). We also found high test-retest reliability of Kinect V2 for most upper limb angular waveforms (CMC=0.75-0.99), angles at the PTA (ICC=0.65-0.91), and the ROM (ICC=0.68-0.96). Significance. Kinect V2 has great potential as a low-cost, easily implemented device for assessing upper limb angular waveforms during functional tasks. The system is suitable for assessing relative within-person change in upper limb motion over time, such as disease progression or improvement due to intervention.
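To make the validity metrics concrete, the sketch below computes an interdevice CMC, RMSE, and Pearson's r for a pair of joint-angle waveforms. It is a minimal illustration, assuming the Kadaba-style formulation of the coefficient of multiple correlation and synthetic stand-in data; the authors' exact formulation and preprocessing may differ.

```python
# Hedged sketch: inter-device waveform validity metrics (CMC, RMSE, r)
# between a Kinect V2 and a Vicon joint-angle curve. The Kadaba-style
# CMC variant and all variable names are illustrative assumptions.
import numpy as np
from scipy.stats import pearsonr

def cmc(waveforms: np.ndarray) -> float:
    """Coefficient of multiple correlation for M waveforms of T frames.

    waveforms: array of shape (M, T), e.g. time-normalised joint-angle
    curves from Kinect V2 and Vicon for one trial (M = 2 devices).
    """
    m, t = waveforms.shape
    frame_means = waveforms.mean(axis=0)   # mean across devices per frame
    grand_mean = waveforms.mean()
    within = ((waveforms - frame_means) ** 2).sum() / (t * (m - 1))
    total = ((waveforms - grand_mean) ** 2).sum() / (m * t - 1)
    return float(np.sqrt(max(0.0, 1.0 - within / total)))

def rmse(a: np.ndarray, b: np.ndarray) -> float:
    """Root mean squared error between two angle waveforms (degrees)."""
    return float(np.sqrt(np.mean((a - b) ** 2)))

# Synthetic shoulder flexion/extension curves over a 0-100% task cycle,
# standing in for real Kinect and Vicon measurements.
time = np.linspace(0, 1, 101)
vicon = 90 * np.sin(np.pi * time)                      # reference waveform
kinect = vicon + np.random.normal(0, 3, vicon.shape)   # noisy estimate

print(f"CMC  = {cmc(np.vstack([kinect, vicon])):.3f}")
print(f"RMSE = {rmse(kinect, vicon):.2f} deg")
print(f"r    = {pearsonr(kinect, vicon)[0]:.3f}")
```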
We developed a deep-learning-refined kinematic model for accurately assessing upper limb joint angles using a single Kinect v2 sensor. We trained a long short-term memory (LSTM) recurrent neural network in a supervised learning framework to compensate for the systematic error of the Kinect kinematic model, taking a marker-based three-dimensional motion capture system (3DMC) as the gold standard. A series of upper limb functional task experiments was conducted: hand to contralateral shoulder, hand to mouth (drinking), combing hair, and hand to back pocket. Our deep-learning-based model significantly improved the performance of a single Kinect v2 sensor for all investigated upper limb joint angles across all functional tasks. Using a single Kinect v2 sensor, our model measured shoulder and elbow flexion/extension waveforms with mean CMCs >0.93 for all tasks, and shoulder adduction/abduction and internal/external rotation waveforms with mean CMCs >0.8 for most tasks. The mean deviations of the angles at the point of target achieved and the range of motion were under 5° for all investigated joint angles during all functional tasks. Compared with 3DMC, the presented system is easier to operate and requires less laboratory space.
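The sketch below illustrates the error-compensation idea: an LSTM that maps raw Kinect v2 joint-angle sequences to 3DMC-matched angles. The choice of PyTorch, the layer sizes, and the toy training loop are assumptions for illustration, not the authors' exact architecture or hyperparameters.

```python
# Hedged sketch: supervised LSTM refinement of Kinect v2 joint angles,
# with a marker-based 3DMC as the training target. Architecture and
# data shapes are illustrative assumptions.
import torch
import torch.nn as nn

class AngleRefiner(nn.Module):
    """LSTM regressor: per-frame Kinect angles -> corrected angles."""
    def __init__(self, n_angles: int = 4, hidden: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(n_angles, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_angles)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, frames, n_angles) raw Kinect angles in degrees
        out, _ = self.lstm(x)
        return self.head(out)   # corrected angles, same shape as x

# Toy training loop on random stand-in data; in practice the inputs
# would be paired Kinect/3DMC recordings of the four functional tasks.
model = AngleRefiner()
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

kinect_angles = torch.randn(8, 120, 4)   # 8 trials, 120 frames, 4 angles
vicon_angles = kinect_angles + 0.1 * torch.randn_like(kinect_angles)

for epoch in range(5):
    optimiser.zero_grad()
    loss = loss_fn(model(kinect_angles), vicon_angles)
    loss.backward()
    optimiser.step()
```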
Low-cost, portable, and easy-to-use Kinect-based systems have gained great popularity in out-of-the-lab motion analysis. The placement of the Kinect sensor, however, significantly influences the accuracy of the measured kinematic parameters in dynamic tasks. We conducted an experiment to investigate the impact of sensor placement on the accuracy of upper limb kinematics during a typical upper limb functional task, the drinking task. Using a 3D motion capture system as the gold standard, we tested twenty-one Kinect positions combining three distances and seven orientations. Upper limb joint angles, including shoulder flexion/extension, shoulder adduction/abduction, shoulder internal/external rotation, and elbow flexion/extension, were calculated via our developed Kinect kinematic model and the UWA kinematic model for the Kinect-based system and the 3D motion capture system, respectively. We extracted the angles at the point of target achieved (PTA). The mean absolute error (MAE) relative to the gold standard represents the performance of the Kinect-based system. We conducted a two-way repeated-measures ANOVA to explore the impact of distance and orientation on the MAEs for all upper limb angles. There was a significant main effect of orientation; the main effect of distance and the interaction effect did not reach statistical significance. Post hoc LSD tests for orientation showed that the effect of orientation is joint-dependent and plane-dependent. For a complex task (e.g., drinking) that involves body occlusions, placing the Kinect sensor directly in front of the subject is not a good choice. We suggest placing the Kinect sensor on the contralateral side of the subject at an orientation of approximately 30° to 45° for upper limb functional tasks. For dynamic tasks in general, we put forward the following recommendations for Kinect sensor placement. First, choose a sensor position that keeps all investigated joints visible throughout the task. Second, avoid body occlusion at the point of maximum extension. Third, if the optimal location cannot be achieved in an out-of-the-lab environment, place the sensor at the optimal orientation, trading off distance. Last, to assess the function of both limbs, relocate the sensor and re-evaluate the other side once one side has been assessed.
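The following sketch shows one way to run the 3 (distance) x 7 (orientation) within-subject analysis described above, using statsmodels' AnovaRM. The column names, factor levels, and placeholder MAE values are assumptions standing in for the measured errors; the authors' actual pipeline may differ.

```python
# Hedged sketch: two-way repeated-measures ANOVA on per-subject MAEs
# with within-subject factors distance (3 levels) and orientation
# (7 levels). Data here are random placeholders.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
subjects = range(1, 11)
distances = ["1.5m", "2.0m", "2.5m"]
orientations = [f"{deg}deg" for deg in (0, 15, 30, 45, 60, 75, 90)]

# One MAE per subject per (distance, orientation) cell.
rows = [
    {"subject": s, "distance": d, "orientation": o, "mae": rng.normal(8, 2)}
    for s in subjects for d in distances for o in orientations
]
data = pd.DataFrame(rows)

# Main effects of distance and orientation plus their interaction.
result = AnovaRM(data, depvar="mae", subject="subject",
                 within=["distance", "orientation"]).fit()
print(result)
```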