Abstract-Motion analysis technologies have been widely used to monitor the potential for injury and enhance athlete performance. However, most of these technologies are expensive, can only be used in laboratory environments, and examine only a few trials of each movement action. In this paper, we present a novel ambulatory motion analysis framework using wearable inertial sensors to accurately assess all of an athlete's activities in a real training environment. We firstly present a system that automatically classifies a large range of training activities using the Discrete Wavelet Transform (DWT) in conjunction with a random forest classifier. The classifier is capable of successfully classifying various activities with up to 98% accuracy. Secondly, a computationally efficient gradient descent algorithm is used to estimate the relative orientations of the wearable inertial sensors mounted on the shank, thigh and pelvis of a subject, from which the flexion-extension knee and hip angles are calculated. These angles, along with sacrum impact accelerations, are automatically extracted for each stride during jogging. Finally, normative data is generated and used to determine whether a subject's movement technique differs from the normative data, in order to identify potential injury-related factors. For the joint angle data this is achieved using a curve-shift registration technique. It is envisaged that the proposed framework could be utilized for accurate and automatic sports activity classification and reliable movement technique evaluation in various unconstrained environments, for both injury management and performance enhancement.
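The curve-shift registration step can be illustrated with a minimal sketch: the subject's per-stride joint-angle curve is shifted in time until it best matches the normative curve, so that shape differences can then be assessed without timing misalignment. The brute-force integer-shift search below is an illustrative assumption for exposition, not the authors' implementation.

```python
def best_shift(curve, reference, max_shift=20):
    """Find the integer time shift (in samples) that best aligns
    `curve` to `reference` by minimizing mean squared error over
    the overlapping region. Purely illustrative of curve-shift
    registration; real implementations typically work on continuous
    warps of functional data."""
    best, best_err = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        err, count = 0.0, 0
        for i in range(len(reference)):
            j = i + s
            if 0 <= j < len(curve):  # only compare overlapping samples
                err += (curve[j] - reference[i]) ** 2
                count += 1
        if count and err / count < best_err:
            best, best_err = s, err / count
    return best
```

Once the optimal shift is known, the registered curve can be compared point-by-point against the normative band to flag strides whose technique deviates.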
Abstract-Motion analysis technologies have been widely used to monitor the potential for injury and enhance athlete performance. However, most of these technologies are expensive, can only be used in laboratory environments, and examine only a few trials of each movement action. In this paper, we present a novel ambulatory motion analysis framework using wearable inertial sensors to accurately assess all of an athlete's activities in an outdoor training environment. We firstly present a system that automatically classifies a large range of training activities using the Discrete Wavelet Transform (DWT) in conjunction with a random forest classifier. The classifier is capable of successfully classifying various activities with up to 98% accuracy. Secondly, a computationally efficient gradient descent algorithm is used to estimate the relative orientations of the wearable inertial sensors mounted on the thigh and shank of a subject, from which the flexion-extension knee angle is calculated. Finally, a curve-shift registration technique is applied to both generate normative data and determine whether a subject's movement technique differs from the normative data, in order to identify potential injury-related factors. It is envisaged that the proposed framework could be utilized for accurate and automatic sports activity classification and reliable movement technique evaluation in various unconstrained environments.
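The knee-angle computation can be sketched as follows: given the orientation quaternions of the thigh- and shank-mounted sensors, the relative rotation is the conjugate of one times the other, and the flexion-extension angle is its twist about the hinge (medio-lateral) axis. The choice of the x-axis as the hinge and the swing-twist extraction below are assumptions for illustration; the abstract does not specify the sensor axis conventions.

```python
import math

def q_mul(a, b):
    """Hamilton product of two quaternions (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def q_conj(q):
    w, x, y, z = q
    return (w, -x, -y, -z)

def knee_flexion_deg(q_thigh, q_shank):
    """Flexion-extension angle: rotation of the shank relative to the
    thigh, projected onto an assumed medio-lateral (x) hinge axis via
    swing-twist decomposition."""
    w, x, _, _ = q_mul(q_conj(q_thigh), q_shank)
    return math.degrees(2.0 * math.atan2(x, w))
```

With calibrated sensor-to-segment alignment, evaluating this per sample yields the continuous flexion-extension curve for each stride.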
Abstract. In this paper, we investigate efficient recognition of human gestures/movements from multimedia and multimodal data, including the Microsoft Kinect and translational and rotational acceleration and velocity from wearable inertial sensors. We firstly present a system that automatically classifies a large range of activities (17 different gestures) using a random forest classifier. Our system can achieve near real-time recognition by appropriately selecting the sensors that contribute most to a particular task. Features extracted from multimodal sensor data were used to train and evaluate a customized classifier. This novel technique is capable of successfully classifying various gestures with up to 91% overall accuracy on a publicly available data set. Secondly, we investigate a wide range of different motion capture modalities and compare their results in terms of gesture recognition accuracy using our proposed approach. We conclude that gesture recognition can be effectively performed by considering an approach that overcomes many of the limitations associated with the Kinect and potentially paves the way for low-cost gesture recognition in unconstrained environments.
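Selecting the most informative sensors can be illustrated with a simple class-separability criterion: rank each candidate feature (or sensor channel) by the ratio of between-class to within-class variance, and keep the top-scoring ones. The Fisher-score ranking below is a common generic choice, offered as an assumption; the abstract does not state which selection criterion the authors used.

```python
def fisher_score(feature_values, labels):
    """Rank one feature by between-class vs. within-class variance.
    Higher scores mean the feature separates the gesture classes
    better and is a stronger candidate for the classifier."""
    overall = sum(feature_values) / len(feature_values)
    between, within = 0.0, 0.0
    for c in set(labels):
        vals = [v for v, l in zip(feature_values, labels) if l == c]
        mean_c = sum(vals) / len(vals)
        between += len(vals) * (mean_c - overall) ** 2
        within += sum((v - mean_c) ** 2 for v in vals)
    return between / within if within else float("inf")
```

Scoring every channel this way and retaining only the highest-ranked ones reduces the feature vector fed to the random forest, which is one route to the near real-time recognition the abstract describes.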
We present a demonstration of a multi-modal 3D capturing platform coupled to a motion comparison system. This work is focused on the preservation of Traditional Sports and Games, namely the Gaelic sports from Ireland and Basque sports from France and Spain. Users can learn and compete in the performance of sporting gestures, comparing themselves to real athletes. Our online gesture database provides a way to preserve and display a wide range of sporting gestures. The capturing devices utilised are Kinect 2 sensors and wearable inertial sensors, where the number required varies based on the requested scenario. The fusion of these two capture modalities, coupled to our inverse kinematic algorithm, allows us to synthesize a fluid and reliable 3D model of the user's gestures over time. Our novel comparison algorithms provide the user with a performance score and a set of comparison curves (i.e. joint angles and angular velocities), providing precise and valuable feedback for coaches and players.
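A gesture-comparison score of this kind is often built on dynamic time warping (DTW), which compares two joint-angle curves while tolerating differences in execution speed. The DTW distance and the `similarity_score` mapping below are hypothetical choices for illustration; the demonstration's actual comparison algorithms are not detailed in this summary.

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two joint-angle curves,
    using absolute difference as the local cost."""
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j],      # insertion
                                 d[i][j - 1],      # deletion
                                 d[i - 1][j - 1])  # match
    return d[n][m]

def similarity_score(a, b):
    """Map the DTW distance to a bounded score (100 = identical)."""
    return 100.0 / (1.0 + dtw_distance(a, b))
```

Because DTW aligns the curves nonlinearly in time, a user performing the same gesture slightly slower than the reference athlete is not penalized for timing alone.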
Abstract. In this paper, we target enhanced 3D reconstruction of non-rigidly deforming objects based on a view-independent surface representation with an automated recursive filtering scheme. This work improves upon the KinectDeform algorithm, which we recently proposed. KinectDeform uses an implicit view-dependent volumetric truncated signed distance function (TSDF) based surface representation. The view-dependence makes its pipeline complex by requiring surface prediction and extraction steps based on the camera's field of view. This paper proposes to use an explicit projection-based Moving Least Squares (MLS) surface representation from point-sets. Moreover, the empirical weighted filtering scheme in KinectDeform is replaced by an automated fusion scheme based on a Kalman filter. We analyze the performance of the proposed algorithm both qualitatively and quantitatively and show that it is able to produce enhanced and feature-preserving 3D reconstructions.
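The Kalman-based fusion can be sketched in its simplest scalar form: each surface point's position estimate is refined recursively as new noisy per-frame measurements arrive, with the gain automatically balancing the current estimate against each measurement. The one-dimensional, locally-static-surface formulation below is a simplifying assumption for exposition; the paper's filter operates on the full MLS surface representation.

```python
def kalman_fuse(measurements, meas_var, process_var=1e-4):
    """Recursively fuse noisy per-frame estimates of one scalar
    quantity (e.g. a point's depth) with a 1-D Kalman filter.
    Assumes the underlying value changes little between frames."""
    x, p = measurements[0], meas_var  # initial state and variance
    for z in measurements[1:]:
        p += process_var              # predict: inflate uncertainty
        k = p / (p + meas_var)        # Kalman gain
        x += k * (z - x)              # update toward the measurement
        p *= (1.0 - k)                # shrink posterior variance
    return x
```

Compared with a fixed empirical weighting, the gain `k` adapts automatically: early frames move the estimate a lot, while later frames apply ever finer corrections, which is the behaviour the automated fusion scheme aims for.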
Abstract-We present a novel framework to monitor the three-dimensional trajectory (orientation and position) of a golf swing using miniaturized inertial sensors. Firstly, we employed a highly accurate and computationally efficient revised gradient descent algorithm to obtain the orientation of a golf club. Secondly, we designed a series of digital filters to determine the backward and forward segments of the swing, enabling us to calculate drift-free linear velocity along with the relative 3D position of the golf club during the entire swing. Finally, the calculated motion trajectory was verified against a ground truth VICON system using Iterative Closest Point (ICP) in conjunction with Principal Component Analysis (PCA). The computationally efficient framework presented here achieves a high level of accuracy (r = 0.9885, p < 0.0001) for such a low-cost system. This framework can be utilized for reliable movement technique evaluation and can provide near real-time feedback for athletes in various unconstrained environments. It is envisaged that the proposed framework is applicable to other racket-based sports (e.g. tennis, cricket and hurling).
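The drift-free velocity idea can be sketched with a zero-velocity-update-style correction: once the swing segment is isolated, acceleration is integrated to velocity, and the residual drift is removed under the assumption that the club is (near) stationary at both ends of the segment. This endpoint-based linear detrending is a common generic technique, shown here as an assumption rather than the authors' exact filter design.

```python
def drift_free_velocity(accel, dt):
    """Integrate a 1-D acceleration segment to velocity, then remove
    linear drift assuming zero velocity at both segment endpoints
    (e.g. address and follow-through of a golf swing)."""
    v, vel = 0.0, [0.0]
    for a in accel[1:]:
        v += a * dt                      # rectangle-rule integration
        vel.append(v)
    if len(vel) > 1:
        drift = vel[-1] / (len(vel) - 1)  # residual drift per sample
        vel = [vi - drift * i for i, vi in enumerate(vel)]
    return vel
```

A constant sensor bias produces a linearly growing velocity error, which this correction removes exactly; integrating the corrected velocity then yields the relative 3D position without the usual drift accumulation.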