The purpose of this study was to assess the influence of real-time auditory feedback on knee proprioception. Thirty healthy participants were randomly allocated to a control group (n = 15) or experimental group I (n = 15). Participants performed an active knee-repositioning task with their dominant leg, with and without additional real-time auditory feedback in which frequency was mapped in a convergent manner to two target angles (40° and 75°). Statistical analysis revealed a significant enhancement in knee-repositioning accuracy, in terms of constant and absolute error, with real-time auditory feedback, both within and across groups. Besides this convergent condition, we established a second, divergent condition in which a step-wise transposition of frequency was applied to explore whether a systematic coupling between auditory feedback and proprioceptive repositioning exists. No significant effects were identified in this divergent auditory feedback condition. An additional experimental group II (n = 20) was then included to investigate the influence of a larger magnitude and a directional change of the step-wise frequency transposition. The results of this second experiment first confirmed the findings of experiment I. Moreover, significant effects on auditory-proprioceptive knee repositioning emerged when divergent auditory feedback was applied: during the step-wise transposition, participants systematically modulated their knee movements in the direction opposite to the transposition. We confirm that knee-repositioning accuracy can be enhanced by concurrent real-time auditory feedback and that knee repositioning can be modulated in a goal-directed manner by step-wise transposition of frequency. Clinical implications are discussed with respect to joint position sense in rehabilitation settings.
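The convergent angle-to-frequency mapping described above can be sketched as a simple function in which the tone converges on a reference pitch as the knee approaches the target angle. This is a minimal illustration only: the linear form, the reference frequency of 440 Hz, and the slope are assumptions for the sketch, not the study's actual sonification parameters.

```python
def angle_to_frequency(angle_deg, target_deg, f_target=440.0, slope=8.0):
    """Map a knee angle (degrees) to a tone frequency (Hz) that converges
    on f_target as the joint approaches the target angle.

    Hypothetical parameters: f_target and slope are illustrative, not taken
    from the study. Any angular error shifts the pitch away from f_target,
    so reaching the target angle is audible as hitting the reference tone.
    """
    return f_target + slope * (angle_deg - target_deg)
```

For example, at the 40° target the tone sits exactly at the reference pitch, while a 5° overshoot raises it proportionally, giving the listener a continuous error signal.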
The pattern of gait after hip arthroplasty strongly affects regeneration and quality of life. Acoustic feedback could be a supportive method for patients to improve their walking ability and to regain a symmetric and steady gait. In this study, a new gait sonification method with two different modes—real-time feedback (RTF) and instructive model sequences (IMS)—is presented. The impact of the method on gait symmetry and steadiness of 20 hip arthroplasty patients was investigated. Patients were either assigned to a sonification group (SG) (n = 10) or a control group (CG) (n = 10). All of them performed 10 gait training sessions (TS) lasting 20 min, in which kinematic data were measured using an inertial sensor system. Results demonstrate converging step lengths of the affected and unaffected leg over time in SG compared with a nearly parallel development of both legs in CG. Within the SG, a higher variability of stride length and stride time was found during the RTF training mode in comparison to the IMS mode. Therefore, the presented dual mode method provides the potential to support gait rehabilitation as well as home-based gait training of orthopedic patients with various restrictions.
When two individuals interact in a collaborative task, such as carrying a sofa or a table, spatiotemporal coordination of individual motor behavior usually emerges. In many cases, interpersonal coordination can arise independently of verbal communication, based on observation of the partner's movements and/or the object's movements. In this study, we investigate how social coupling between two individuals can emerge in a collaborative task under different modes of perceptual information. A visual reference condition was compared with three conditions featuring new types of additional auditory feedback provided in real time: effect-based auditory feedback, performance-based auditory feedback, and combined effect/performance-based auditory feedback. We developed a new paradigm in which the actions of both participants continuously result in a seamlessly merged effect on an object simulated by a tablet computer application. Participants were required to temporally synchronize their movements with a 90° phase difference and to precisely adjust their finger dynamics in order to keep the object (a ball) rotating accurately on a given circular trajectory on the tablet. Results demonstrate that interpersonal coordination in a joint task can be altered by different kinds of additional auditory information in various ways.
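The 90° target phase relation in this paradigm can be quantified as a wrapped relative-phase error between the two participants' movement signals. The function below is an illustrative sketch only; the signal representation (instantaneous phases in radians) and the default target are assumptions, not the study's analysis code.

```python
import math

def relative_phase_error(phase_a, phase_b, target_deg=90.0):
    """Signed deviation (degrees) of the phase lag between two movement
    signals from a target phase difference, wrapped to [-180, 180).

    phase_a, phase_b: instantaneous phases in radians (e.g., extracted
    from each participant's finger movement). A return value of 0 means
    the pair holds exactly the target phase relation.
    """
    diff = math.degrees(phase_a - phase_b) - target_deg
    return (diff + 180.0) % 360.0 - 180.0
```

Tracking this error over time gives a simple measure of how well a dyad maintains the required 90° coordination pattern under each feedback condition.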
The background of our study is the application of advanced real-time gait analysis to walking interventions in daily-life settings. A wide range of wearable devices provide gait information, but rarely more than pedometer functions such as step counting, displacement, and velocity. This paper proposes a real-time gait analysis method based on a head-worn inertial measurement unit (H-IMU). The method detects gait events (heel strike, toe-off, mid-stance) in real time and immediately provides detailed spatiotemporal parameters. Its reliability was verified in a measurement comprising over 11,000 steps from seven participants on a 400 m outdoor track. The gait analysis was conducted without the limitation of a fixed reference frame (e.g., an indoor stage with infrared cameras). The mean absolute step-counting error was 0.24%. Compared with a pedometer, additional gait parameters were obtained, such as foot-ground contact time (CT) and contact time ratio (CTR). The gait monitoring system can serve as real-time and long-term feedback, applicable to health-status management and injury prevention.
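Gait event detection from a head-worn IMU can be sketched as thresholded peak picking on the vertical acceleration signal with a refractory interval between successive events. This is a generic sketch under stated assumptions: the threshold, the minimum inter-step interval, and treating acceleration peaks as heel strikes are illustrative choices, not the paper's actual algorithm.

```python
def detect_heel_strikes(acc_z, fs, threshold=1.5, min_interval=0.4):
    """Detect candidate heel-strike events in a vertical acceleration
    trace (units of g) sampled at fs Hz.

    An event is a local maximum exceeding `threshold` that occurs at
    least `min_interval` seconds after the previous event (refractory
    gap, to avoid double-counting one step). Threshold and interval are
    hypothetical values for illustration. Returns sample indices.
    """
    min_gap = int(min_interval * fs)
    events = []
    last = -min_gap
    for i in range(1, len(acc_z) - 1):
        is_peak = acc_z[i] >= acc_z[i - 1] and acc_z[i] > acc_z[i + 1]
        if is_peak and acc_z[i] > threshold and i - last >= min_gap:
            events.append(i)
            last = i
    return events
```

From the detected event indices, step count follows directly, and pairing heel strikes with toe-off events would yield contact time (CT) and, divided by stride time, the contact time ratio (CTR) mentioned above.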