Background
The primary aim of this study was to assess the level of engagement in computer-based simulations of functional tasks, using a haptic device, for people with chronic traumatic brain injury (TBI). The objectives were to design functional tasks using a force-feedback device and to determine whether it could measure improvement in motor performance.
Methods
A prospective cross-sectional study was performed in a biomedical research facility. The testing environment consisted of a single, interactive, stylus-driven computer session navigating virtual scenes in 3D space. Subjects completed a haptic training session (TRAIN) and then had three chances to perform each of four virtual tasks: (i) remove tools from a workbench (TOOL), (ii) compose three-letter words (SPELL), (iii) manipulate utensils to prepare a sandwich (SAND), and (iv) tool use (TUSE). Main outcome measures included self-report of engagement in the activities; improved performance on the simulated tasks, estimated by an observer and measured as change from baseline in time to completion or number of words completed; and correlations among the performance measures and self-reports of boredom, the Neuropsychological Symptom Inventory (NSI), and the Purdue Peg Motor Test (PPT).
Results
Participants were 19 adults from the community with a 1-year history of non-penetrating TBI who were able to use computers. Seven had mild, 3 moderate, and 9 severe TBIs. Mean score on the Boredom Proneness Scale (BPS) was 107 (normal range 81–117); mean NSI, 32; mean PPT, 54 (normal range for assembly-line workers >67). Responses to the intervention: 3 (15%) subjects did not repeat all three trials of the tasks; 100% reported they were highly engaged in the interactions; 6 (30%) reported a high level of frustration with the tasks but completed them with short breaks.
Performance measures (baseline vs. post-training): TOOL time decreased by a mean of 60 s; SPELL increased by 2.7 words; TUSE time decreased by a mean of 68 s; and SAND time decreased by a mean of 72 s. PPT correlated with TOOL time (r = −0.65, p = 0.016) and TUSE time (r = −0.60, p = 0.014). SPELL correlated with the boredom score (r = 0.41, p = 0.08) and NSI (r = −0.49, p = 0.05).
Conclusion
People with chronic TBI of various ages and severities report being engaged when using haptic devices that interact with 3D virtual environments. Haptic devices can capture objective data that provide useful information about fine motor and cognitive performance.
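The reported correlations are Pearson coefficients. As a minimal illustration of how such a coefficient relates a dexterity score to a task time, the sketch below computes Pearson's r in pure Python on hypothetical data (the values are invented for illustration and are not the study's data):

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical illustration: higher dexterity (PPT) scores paired with
# shorter TOOL completion times yield a negative correlation, matching
# the sign of the r values reported above.
ppt = [40, 45, 50, 55, 60, 65]
tool_time = [210, 200, 180, 160, 150, 130]
r = pearson_r(ppt, tool_time)
```

A strongly negative r, as with PPT versus TOOL time here, means higher dexterity scores accompany shorter completion times.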
Gait analysis has been an active area of research for several decades. In this paper, we propose image-flow-based methods to automatically compute the motion and velocities of different body segments using a single inexpensive video camera. We then identify and extract the events of the gait cycle (double-support, mid-swing, toe-off, and heel-strike) from the video images. Experiments were conducted in which four walking subjects were captured from the sagittal plane. Automatic segmentation was performed to isolate the moving body from the background. The head excursion and the shank motion were then computed to identify the key frames corresponding to the different gait-cycle events. Our approach requires neither calibrated cameras nor special markers to capture movement. We also compared our method with the Optotrak 3D motion capture system and found our results to be in good agreement with the Optotrak results. Our method has potential use in markerless, unencumbered video capture of human locomotion; monitoring gait in homes and communities is a useful application for elderly and disabled people. The method could also serve as an assessment tool to determine gait symmetry or to establish an individual's normal gait pattern.
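The abstract describes identifying gait-cycle events from computed segment velocities. One simple way to flag candidate events such as toe-off or heel-strike is to detect sign changes in a velocity trace; the sketch below is a hypothetical illustration of that idea, not the paper's actual image-flow pipeline:

```python
def sign_changes(velocity, times):
    """Return the times at which a velocity trace changes sign --
    candidate gait-cycle events such as toe-off or heel-strike
    (hypothetical sketch; the paper derives velocities from image flow)."""
    events = []
    for i in range(1, len(velocity)):
        if velocity[i - 1] * velocity[i] < 0:
            events.append(times[i])
    return events

# Synthetic shank velocity: swings positive, then negative, then positive.
v = [0.5, 0.8, 0.3, -0.2, -0.6, -0.1, 0.4]
t = [0, 1, 2, 3, 4, 5, 6]
events = sign_changes(v, t)  # sign flips at t=3 and t=6
```

In practice the velocity trace would come from the computed shank motion, and the detected frames would be the key frames mentioned above.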
We present a novel approach to gait analysis using ensemble Kalman filtering that permits markerless determination of segmental movement. We use image-flow analysis to reliably compute temporal and kinematic measures, including the translational velocity of the torso and the rotational velocities of the lower-leg segments. Detecting the instants where velocity changes direction also determines the standard events of a gait cycle (double-support, toe-off, mid-swing, and heel-strike). To determine the kinematics of the lower limbs, we model the synergies between the lower-limb motions (thigh-shank, shank-foot) by building a nonlinear dynamical system using CMU's 3D motion capture database. This information is fed into the ensemble Kalman filter framework to estimate the unobserved limb (upper leg and foot) motion from the measured lower-leg rotational velocity. Our approach requires neither calibrated cameras nor special markers to capture movement. We tested our method on different gait sequences collected from the sagittal plane and present the estimated kinematics overlaid on the original image frames. We also validated our approach by manually labeling the videos and comparing our results against the labels.
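An ensemble Kalman filter maintains a set of state samples and nudges each member toward a new observation using sample covariances. The sketch below shows a single scalar analysis step with a perturbed-observation update. It is a minimal illustration under simplified assumptions (scalar state, identity observation map), not the authors' filter, which couples such an update with a learned limb-synergy dynamics model:

```python
import random

def enkf_update(ensemble, observation, obs_noise_std, h=lambda x: x):
    """One analysis step of an ensemble Kalman filter for a scalar state.
    Minimal sketch with an identity observation map h; the sample
    covariances play the role of the Kalman gain's P terms."""
    n = len(ensemble)
    hx = [h(x) for x in ensemble]
    x_mean = sum(ensemble) / n
    h_mean = sum(hx) / n
    # Sample cross-covariance between state and predicted observation,
    # and sample variance of the predicted observation.
    p_xh = sum((x - x_mean) * (y - h_mean) for x, y in zip(ensemble, hx)) / (n - 1)
    p_hh = sum((y - h_mean) ** 2 for y in hx) / (n - 1)
    gain = p_xh / (p_hh + obs_noise_std ** 2)
    # Perturbed-observation update: each member assimilates a noisy copy.
    return [x + gain * (observation + random.gauss(0, obs_noise_std) - h(x))
            for x in ensemble]

random.seed(0)
prior = [random.gauss(0.0, 1.0) for _ in range(200)]  # prior belief near 0
posterior = enkf_update(prior, observation=2.0, obs_noise_std=0.5)
post_mean = sum(posterior) / len(posterior)  # pulled toward the observation
```

With a prior variance near 1 and observation variance 0.25, the posterior mean lands between the prior mean and the observation, much closer to the observation.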