Virtual home-based rehabilitation is an emerging area in stroke rehabilitation. Functional assessment tools are essential to monitor recovery and to tailor rehabilitation to a patient's current function. We developed a Fugl-Meyer Assessment (FMA) tool using Kinect (Microsoft, USA) and validated it in hemiplegic stroke patients. Forty-one patients with hemiplegic stroke were enrolled. Thirteen of 33 items were selected for the upper extremity motor FMA. One occupational therapist administered the motor FMA while recording upper extremity motion with Kinect. FMA scores were predicted from the saved motion data using principal component analysis and artificial neural network learning. The degree of jerkiness of each motion was also quantified as a jerky score. Prediction accuracy for each of the 13 items and correlations between real FMA scores and Kinect-based scores were analyzed. Prediction accuracies ranged from 65% to 87% across items and exceeded 70% for 9 items. The summed score for the 13 items correlated strongly between the real FMA and the Kinect-based assessment (Pearson's correlation coefficient = 0.873, P < 0.0001), as did the total upper extremity score (66 in full score) with the Kinect-based score (26 in full score) (Pearson's correlation coefficient = 0.799, P < 0.0001). Log-transformed jerky scores were significantly higher on the hemiplegic side (1.81 ± 0.76) than on the non-hemiplegic side (1.21 ± 0.43) and showed a significant negative correlation with Brunnstrom stage (3 to 6; Spearman correlation coefficient = -0.387, P = 0.046). FMA using Kinect is a valid way to assess upper extremity function and can provide additional measures of movement quality in stroke patients. This may be useful in the setting of unsupervised home-based rehabilitation.
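The abstract does not give the exact formula behind the log-transformed jerky score, but jerk is conventionally the third time derivative of position. A minimal sketch of one plausible computation from Kinect joint trajectories follows; the function name, the averaging of jerk magnitude, and the log10(1 + x) transform are illustrative assumptions, not the authors' published method:

```python
import numpy as np

def log_jerk_score(positions, fs):
    """Illustrative log-transformed jerk score.

    positions : (T, 3) array of a joint's 3-D positions sampled at fs Hz.
    Jerk is approximated by the third finite difference of position
    divided by dt**3; its magnitude is averaged over the recording and
    log-transformed so that very jerky motions do not dominate the scale.
    """
    dt = 1.0 / fs
    jerk = np.diff(positions, n=3, axis=0) / dt**3   # (T-3, 3) jerk vectors
    mean_jerk = np.linalg.norm(jerk, axis=1).mean()  # mean jerk magnitude
    return np.log10(1.0 + mean_jerk)
```

On this definition, a smooth trajectory yields a lower score than the same trajectory corrupted by tremor-like noise, matching the reported hemiplegic vs. non-hemiplegic contrast in direction.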
A prototype system that replaces conventional time-lapse imaging in microscopic inspection with a smartphone is presented. Existing time-lapse imaging requires a video data feed between a microscope and a computer that varies depending on the type of image grabber. Even with a proper hardware setup, a series of tedious and repetitive tasks is still required to relocate the region-of-interest (ROI) of the specimens. To simplify the system and improve the efficiency of time-lapse imaging tasks, a smartphone-based platform utilizing microscopic augmented reality (μ-AR) markers is proposed. To evaluate the feasibility and efficiency of the proposed system, a user test was designed and performed, measuring the elapsed time for a trial of the task from the launch of the application software to the completion of restoring and imaging an ROI saved in advance. The results showed that the average elapsed time was 65.3 ± 15.2 s, with a position error of 6.86 ± 3.61 μm and an angle error of 0.08 ± 0.40 degrees. This indicates that the time-lapse imaging task was accomplished rapidly with a high level of accuracy. Thus, simplification of both the system and the task was achieved via the proposed system.
This study presents a series of protocols for designing and manufacturing a glasses-type wearable device that detects the patterns of temporalis muscle activity during food intake and other physical activities. We fabricated a 3D-printed frame for the glasses and a load-cell-integrated printed circuit board (PCB) module inserted into both hinges of the frame. The module was used to acquire the force signals and transmit them wirelessly. These procedures give the system higher mobility, which can be evaluated in practical wearing conditions such as walking and waggling. Classification performance was also evaluated by distinguishing the patterns of food intake from those of other physical activities. A series of algorithms was used to preprocess the signals, generate feature vectors, and recognize the patterns of the featured activities (chewing and winking) and other physical activities (sedentary rest, talking, and walking). The results showed that the average F1 score of the classification among the featured activities was 91.4%. We believe this approach can be useful for automatic and objective monitoring of ingestive behaviors with higher accuracy, as a practical means of treating ingestive problems.
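The average F1 score reported above is the standard macro-averaged F1 across activity classes. As a minimal, dependency-free sketch of how such a metric is computed (class labels here are hypothetical stand-ins for the paper's activity categories):

```python
def macro_f1(y_true, y_pred, labels):
    """Macro-averaged F1: per-class F1 scores averaged with equal weight,
    as commonly used to summarize multi-class activity recognition."""
    scores = []
    for c in labels:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precision = tp / (tp + fp) if (tp + fp) else 0.0
        recall = tp / (tp + fn) if (tp + fn) else 0.0
        f1 = (2 * precision * recall / (precision + recall)
              if (precision + recall) else 0.0)
        scores.append(f1)
    return sum(scores) / len(scores)
```

Macro averaging weights each activity class equally regardless of how many samples it has, which is why it is a common choice when some activities (e.g. chewing episodes) are rarer than others (e.g. sedentary rest).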