Walking ability is frequently assessed with the 10-meter walking test (10MWT), which may be instrumented with multiple Kinect v2 sensors to complement the typical stopwatch-based time to walk 10 meters with quantitative gait information derived from Kinect’s 3D body-point time series. The current study aimed to evaluate a multi-Kinect v2 set-up for quantitative gait assessments during the 10MWT against a gold-standard motion-registration system by determining between-systems agreement for body-point time series, spatiotemporal gait parameters and the time to walk 10 meters. To this end, the 10MWT was conducted at comfortable and maximum walking speed, while 3D full-body kinematics were concurrently recorded with the multi-Kinect v2 set-up and the Optotrak motion-registration system (i.e., the gold standard). Between-systems agreement for body-point time series was assessed with the intraclass correlation coefficient (ICC). Between-systems agreement was similarly determined for the gait parameters walking speed, cadence, step length, stride length, step width, step time and stride time (all obtained for the intermediate 6 meters) and for the time to walk 10 meters, complemented by Bland-Altman’s bias and limits of agreement. Body-point time series agreed well between the motion-registration systems, particularly so for body points in motion. For both comfortable and maximum walking speeds, the between-systems agreement for the time to walk 10 meters and all gait parameters except step width was high (ICC ≥ 0.888), with negligible biases and narrow limits of agreement. Hence, body-point time series and gait parameters obtained with a multi-Kinect v2 set-up match well with those derived with a gold standard in 3D measurement accuracy.
Future studies are recommended to test the clinical utility of the multi-Kinect v2 set-up to automate 10MWT assessments, thereby complementing the time to walk 10 meters with reliable spatiotemporal gait parameters obtained objectively in a quick, unobtrusive and patient-friendly manner.
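The between-systems agreement statistics used throughout these studies (ICC for absolute agreement and Bland-Altman bias with 95% limits of agreement) can be sketched for paired per-trial measurements from two systems. This is a minimal illustration, not the studies' actual analysis code; all function and variable names are hypothetical:

```python
from statistics import mean, stdev

def bland_altman(a, b):
    """Bland-Altman bias and 95% limits of agreement for paired measurements
    a (system 1) and b (system 2), given as equal-length lists of floats."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = mean(diffs)
    sd = stdev(diffs)  # sample standard deviation of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

def icc_agreement(a, b):
    """Single-measures ICC for absolute agreement (two-way model, k = 2 raters),
    computed from the ANOVA mean squares."""
    n, k = len(a), 2
    rows = list(zip(a, b))
    grand = mean(list(a) + list(b))
    row_means = [mean(r) for r in rows]          # per-subject means
    col_means = [mean(a), mean(b)]               # per-system means
    msr = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)   # between subjects
    msc = n * sum((m - grand) ** 2 for m in col_means) / (k - 1)   # between systems
    sse = sum((rows[i][j] - row_means[i] - col_means[j] + grand) ** 2
              for i in range(n) for j in range(k))
    mse = sse / ((n - 1) * (k - 1))              # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

A near-zero bias with narrow limits of agreement and an ICC close to 1 is the pattern the abstracts above describe as good between-systems agreement.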
The ability to adapt walking to environmental circumstances is an important aspect of walking, yet difficult to assess. The Interactive Walkway was developed to assess walking adaptability by augmenting a multi-Kinect-v2 10-m walkway with gait-dependent visual context (stepping targets, obstacles) using real-time processed markerless full-body kinematics. In this study we determined the Interactive Walkway’s usability for walking-adaptability assessments in terms of between-systems agreement and sensitivity to task and subject variations. Under varying task constraints, 21 healthy subjects performed obstacle-avoidance, sudden-stops-and-starts and goal-directed-stepping tasks. Various continuous walking-adaptability outcome measures were concurrently determined with the Interactive Walkway and a gold-standard motion-registration system: available response time, obstacle-avoidance and sudden-stop margins, step length, stepping accuracy and walking speed. Between-systems agreement was also determined for dichotomous classifications of success and failure on obstacle-avoidance and sudden-stop tasks and for short-stride versus long-stride obstacle-avoidance strategies. Continuous walking-adaptability outcome measures generally agreed well between systems (high intraclass correlation coefficients for absolute agreement, low biases and narrow limits of agreement) and were highly sensitive to task and subject variations. Success and failure ratings varied with available response times and obstacle types and agreed between systems for 85-96% of the trials, while obstacle-avoidance strategies were always classified correctly. We conclude that Interactive Walkway walking-adaptability outcome measures are reliable and sensitive to task and subject variations, even in high-functioning subjects.
We therefore deem Interactive Walkway walking-adaptability assessments usable for obtaining an objective and more task-specific examination of one's ability to walk, which may be feasible for both high-functioning and fragile populations since walking adaptability can be assessed at various levels of difficulty.
Microsoft’s HoloLens, a mixed-reality headset, provides, in addition to holograms, rich head-position data that can be used to quantify what the wearer is doing (e.g., walking) and to parameterize such acts (e.g., speed). The aim of the current study was to determine the test-retest reliability, concurrent validity, and face validity of HoloLens 1 for quantifying spatiotemporal gait parameters. This was done in a group of 23 healthy young adults (mean age 21 years) walking at slow, comfortable, and fast speeds, as well as in a group of 24 people with Parkinson’s disease (mean age 67 years) walking at comfortable speed. Walking was concurrently measured with HoloLens 1 and a previously validated markerless reference motion-registration system. We comprehensively evaluated HoloLens 1 for parameterizing walking (i.e., walking speed, step length and cadence) in terms of test-retest reliability (i.e., consistency over repetitions) and concurrent validity (i.e., between-systems agreement), using the intraclass correlation coefficient (ICC) and Bland–Altman’s bias and limits of agreement. Test-retest reliability and between-systems agreement were excellent for walking speed (ICC ≥ 0.861), step length (ICC ≥ 0.884), and cadence (ICC ≥ 0.765), with narrower between-systems than over-repetitions limits of agreement. Face validity was demonstrated with significantly different walking speeds, step lengths and cadences over walking-speed conditions. To conclude, walking speed, step length, and cadence can be reliably and validly quantified from the position data of the wearable HoloLens 1 measurement system, not only for a broad range of speeds in healthy young adults, but also for self-selected comfortable speed in people with Parkinson’s disease.
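As an illustration of how spatiotemporal gait parameters might be derived from a head-position time series such as the HoloLens provides, the sketch below estimates walking speed, cadence and mean step length. It is a deliberately simplified stand-in for any actual processing pipeline: it assumes a straight-line walk and detects steps as local minima of the vertical head trajectory (the head dips slightly at each step). All names are hypothetical:

```python
from math import hypot

def gait_from_head_positions(xs, zs, ys, fs):
    """Estimate (walking speed [m/s], cadence [steps/min], mean step length [m])
    from a head-position time series sampled at fs Hz.
    xs, zs: horizontal coordinates (m); ys: vertical coordinate (m).
    Simplifying assumptions: straight-line walk, clean vertical oscillation."""
    duration = (len(xs) - 1) / fs
    # straight-line horizontal displacement from start to end of the walk
    distance = hypot(xs[-1] - xs[0], zs[-1] - zs[0])
    speed = distance / duration
    # count strict local minima of the vertical head trajectory as steps
    steps = sum(1 for i in range(1, len(ys) - 1)
                if ys[i] < ys[i - 1] and ys[i] < ys[i + 1])
    cadence = 60.0 * steps / duration
    step_length = distance / steps if steps else float("nan")
    return speed, cadence, step_length
```

In practice the raw signal would be filtered and the minima detection made robust to noise, but the example shows why position data alone suffice for the parameters reported above.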
Mixed-reality technologies are evolving rapidly, allowing for gradually more realistic interaction with digital content while moving freely in real-world environments. In this study, we examined the suitability of the Microsoft HoloLens mixed-reality headset for creating locomotor interactions in real-world environments enriched with 3D holographic obstacles. In Experiment 1, we compared the obstacle-avoidance maneuvers of 12 participants stepping over either real or holographic obstacles of different heights and depths. Participants’ avoidance maneuvers were recorded with three spatially and temporally integrated Kinect v2 sensors. Similar to real obstacles, holographic obstacles elicited obstacle-avoidance maneuvers that scaled with obstacle dimensions. However, with holographic obstacles, some participants showed dissimilar trail or lead foot obstacle-avoidance maneuvers compared to real obstacles: they either consistently failed to raise their trail foot or crossed the obstacle with extreme lead-foot margins. In Experiment 2, we examined the efficacy of mixed-reality video feedback in altering such dissimilar avoidance maneuvers. Participants quickly adjusted their trail-foot crossing height and gradually lowered extreme lead-foot crossing heights in the course of mixed-reality video feedback trials, and these improvements were largely retained in subsequent trials without feedback. Participant-specific differences in real and holographic obstacle avoidance notwithstanding, the present results suggest that 3D holographic obstacles supplemented with mixed-reality video feedback may be used for studying and perhaps also training 3D obstacle avoidance.