Many biomechanical analyses rely on the availability of reliable body segment inertia parameter (BSIP) estimates. Current processes to obtain these estimates involve many time-consuming manual measurements of the human body, used in conjunction with models or equations. While such methods have become the accepted standard, they contain many inherent errors arising from manual measurement and from significant assumptions made in the underlying data used to form the models and equations. Presented here is an alternative approach to obtaining reliable BSIP estimates through the use of the Microsoft Kinect sensor. A 3D scanning system was developed, comprising four Kinects aligned to a single global coordinate system using rigid body calibration and random sample consensus (RANSAC) optimisation. The system offers the advantage of obtaining BSIP estimates in a single scanning operation of around three seconds, much quicker than the circa thirty minutes of manual measurements required by existing BSIP estimation methods. The results obtained with the system show a mean error of 0.04% and a standard deviation of 2.11% in volumetric measurements of a torso manikin, suggesting volumetric estimates of comparable and, in many cases, greater accuracy than a commonly used geometric BSIP model. Further work is needed to extend this study to include a full range of BSIP measurements across more of the body's segments and to include scanning of living human subjects. However, this initial study suggests great potential for a low-cost system that can provide quick and accurate subject-specific BSIP estimates.
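The rigid body calibration step described above amounts to estimating, for each camera, the rotation and translation mapping its local frame onto the global coordinate system. A minimal sketch of one common way to do this, the SVD-based Kabsch method applied to corresponding 3D points, is shown below; the function name and test geometry are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def rigid_transform(src, dst):
    """Estimate rotation R and translation t such that dst ~= src @ R.T + t.

    Uses the Kabsch method: centre both point sets, take the SVD of the
    cross-covariance, and correct for a possible reflection.
    """
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # reflection guard
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t
```

In a multi-camera rig, corresponding points would typically come from a shared calibration object visible to each Kinect, with RANSAC used to reject mismatched correspondences before the final fit.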
Since the introduction of the Microsoft Kinect in November 2010, low-cost consumer depth cameras have rapidly increased in popularity. Their integral technology provides a means of low-cost 3D scanning, extending its accessibility to a far wider audience. Previous work has shown the 3D data from consumer depth cameras to exhibit fundamental measurement errors, likely due to their low cost and original intended application. A number of techniques to correct these errors are presented in the literature, but they are typically device specific or rely on specific open-source drivers. Presented here is a simple method of calibrating consumer depth cameras that relies only on 3D scans of a plane filling the field of view, making it compatible with any device capable of providing 3D point cloud data. Validation of the technique using a Microsoft Kinect sensor has shown non-planarity errors to reduce to around ±3mm, nearing the device's resolution. Further validation based on circumference measures of a cylindrical object has shown a variable error of up to 45mm to reduce to a systematic overestimation of 10mm, based on a 113mm diameter cylinder. Further work is required to test the proposed method on objects of greater complexity and over greater distances. However, this initial work suggests great potential for a simple method of reducing the error apparent in the 3D data from consumer depth cameras, possibly increasing their suitability for a number of applications.
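Quantifying non-planarity error of the kind described above requires fitting a reference plane to the scanned point cloud robustly, since depth outliers would otherwise bias a least-squares fit. A minimal RANSAC plane-fit sketch is given below; parameter values and names are illustrative assumptions, not taken from the study.

```python
import numpy as np

def ransac_plane(points, n_iter=300, tol=0.003, seed=None):
    """Fit a plane (unit normal n, offset d with n.p + d = 0) via RANSAC.

    Repeatedly samples 3 points, forms their plane, and keeps the model
    with the most points within `tol` (here metres, i.e. 3mm).
    Returns the best (n, d) and a boolean inlier mask.
    """
    rng = np.random.default_rng(seed)
    best_model, best_inliers = None, None
    for _ in range(n_iter):
        s = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(s[1] - s[0], s[2] - s[0])
        norm = np.linalg.norm(n)
        if norm < 1e-12:                       # degenerate (collinear) sample
            continue
        n = n / norm
        d = -n @ s[0]
        inliers = np.abs(points @ n + d) < tol  # point-to-plane distances
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_model, best_inliers = (n, d), inliers
    return best_model, best_inliers
```

The residual distances of the inliers to the fitted plane then give the non-planarity error directly, and the same residuals could drive a per-pixel depth correction map.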
Use of anthropometric data to infer sporting performance is increasing in popularity, particularly within elite sport programmes. Measurement typically follows standards set by the International Society for the Advancement of Kinanthropometry (ISAK). However, such techniques are time-consuming, which reduces their practicality. Schranz et al. recently suggested 3D body scanners could replace current measurement techniques; however, current systems are costly. Recent interest in natural user interaction has led to a range of low-cost depth cameras capable of producing 3D body scans, from which anthropometrics can be calculated. A scanning system comprising 4 depth cameras was used to scan 4 cylinders, representative of the body segments. Girth measurements were calculated from the 3D scans and compared to gold-standard measurements. The requirements of a Level 1 ISAK practitioner were met for all 4 cylinders, whereas ISO standards for scan-derived girth measurements were met for the 2 larger cylinders only. A fixed measurement bias was identified that could be corrected with a simple offset factor. Further work is required to determine comparable performance across a wider range of measurements performed upon living participants. Nevertheless, the findings of the study suggest such a system offers many advantages over current techniques, with a range of potential applications.
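One straightforward way to derive a girth from a 3D scan, sketched below purely as an illustration (the study's actual extraction method is not specified in the abstract), is to take a thin horizontal slice of the point cloud, order the slice points by angle about their centroid, and sum the resulting polygon's edge lengths.

```python
import numpy as np

def girth_from_slice(slice_xy):
    """Estimate girth from the 2D points of a horizontal point-cloud slice.

    Orders points by angle around the centroid, then sums the perimeter
    of the closed polygon they form. Assumes a roughly convex cross
    section, such as a cylinder or limb segment.
    """
    centroid = slice_xy.mean(axis=0)
    rel = slice_xy - centroid
    order = np.argsort(np.arctan2(rel[:, 1], rel[:, 0]))
    ring = slice_xy[order]
    edges = np.diff(np.vstack([ring, ring[:1]]), axis=0)  # close the loop
    return np.linalg.norm(edges, axis=1).sum()
```

For dense, low-noise slices of a 113mm diameter cylinder this polygon perimeter converges on the true circumference (pi x 113mm, about 355mm); scan noise would inflate it, consistent with a systematic overestimation correctable by an offset.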
Recent literature suggests that 2D and 3D anthropometric measures are better predictors of sports performance than traditional 1D measures. The emergence of 3D scanning systems offers a cheap, easy and effective method of estimating these measures. Therefore, the aim of this study was to investigate the repeatability of a depth camera based 3D scanning system, and its agreement with manual methods, in the extraction of simple thigh measurements. Using 15 healthy, recreationally active male participants, five measurements of the thigh (upper thigh circumference, mid-thigh circumference, knee circumference, knee to mid-thigh length and mid-thigh to upper thigh length) were taken using an anthropometric tape measure and digital callipers, and scanned using a 4-camera Kinect-based 3D scanning system (using custom analysis software). Agreement and repeatability were subsequently determined. This study demonstrated that a low-cost Kinect-based 3D scanning system is capable of extracting length and circumference measures within ~2% and ~3-4%, respectively, with high repeatability (technical error of measurement (TEM) of ~1.80% and ~0.7%, respectively). The 3D scanning system was able to measure the thigh in good agreement with manual measurement methods, albeit with a systematic bias in circumference, whilst maintaining a very high degree of repeatability, suggesting it is a suitable method for extracting simple thigh measurements.
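The TEM statistic quoted above is conventionally computed from paired repeat measurements as the square root of the sum of squared trial differences over twice the number of pairs, with %TEM expressing it relative to the grand mean. A minimal sketch of the standard formula (not the study's own analysis code):

```python
import numpy as np

def tem(trial1, trial2):
    """Technical error of measurement for two repeated trials.

    Returns (absolute TEM, relative %TEM):
        TEM  = sqrt(sum(d^2) / (2n)),  d = trial1 - trial2
        %TEM = 100 * TEM / grand mean of all measurements
    """
    t1 = np.asarray(trial1, dtype=float)
    t2 = np.asarray(trial2, dtype=float)
    d = t1 - t2
    tem_abs = np.sqrt((d ** 2).sum() / (2 * len(d)))
    tem_pct = 100.0 * tem_abs / np.concatenate([t1, t2]).mean()
    return tem_abs, tem_pct
```

For example, two repeat circumference trials of [50, 51] and [50, 53] (cm) give an absolute TEM of 1.0cm and a %TEM of roughly 2%.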