Human walking is a challenging task that requires rigorous practice; it is a learning process involving complex coordination between the brain and the lower limbs. Bipedal robots that mimic the human morphological structure to produce human-like walking are not yet capable of walking efficiently; owing to these challenges and structural differences, a robot cannot walk like a human being. In this research, to produce a human-like walk, human lower-extremity activities are analysed to understand walking behaviour. The experiments involve different walking styles on different terrains. To capture the learning process of bipedal robot locomotion, a deep-learning-based ensemble classifier is introduced for recognizing human lower-limb activities; seven different walking activities are considered for analysis. An inertial measurement unit (IMU) is used as the wearable device, owing to its small form factor and unobtrusive nature, to capture the movement of different lower-limb joints during walking. Three public datasets, viz. the mHealth, OU-ISIR similar-action, and HAPT inertial sensor datasets, are considered in this study. Two deep learning models, a convolutional neural network (CNN) and a long short-term memory (LSTM) network, are used to classify the activities, and an ensemble of the classifiers is implemented to generalize the results. The ensemble classifier reports accuracies of 99.25%, 88.48%, and 97.44% on the aforementioned datasets, respectively. This work can be applied to postural-stability assessment for elderly subjects, rehabilitation of patients after stroke and trauma, generation of robot walking trajectories in cluttered environments, and reconstruction of impaired walking.
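The abstract does not specify how the CNN and LSTM outputs are combined, so the sketch below shows one common ensembling scheme, soft voting: the per-class probability outputs of the base classifiers are averaged and the highest-scoring class is chosen. The function name and the averaging rule are illustrative assumptions, not the authors' stated method.

```python
import numpy as np

def soft_vote(prob_matrices):
    """Soft-voting ensemble: average the class-probability outputs of
    several base classifiers (e.g. a CNN and an LSTM) over the class
    axis and return the argmax class index per sample."""
    avg = np.mean(np.stack(prob_matrices), axis=0)
    return np.argmax(avg, axis=1)

# Hypothetical softmax outputs of two base models for two samples,
# three activity classes each:
cnn_probs = np.array([[0.7, 0.2, 0.1], [0.1, 0.3, 0.6]])
lstm_probs = np.array([[0.6, 0.3, 0.1], [0.2, 0.2, 0.6]])
labels = soft_vote([cnn_probs, lstm_probs])  # → array([0, 2])
```

Majority (hard) voting is an alternative, but soft voting preserves each model's confidence, which usually helps when the base classifiers have different accuracies, as the three datasets here do.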
Human gait data can be collected using inertial measurement units (IMUs). An IMU is an electronic device that uses an accelerometer and a gyroscope to capture triaxial linear acceleration and triaxial angular velocity. The data so collected are time series in nature. The major challenge associated with these data is the segmentation of signal samples into stride-specific information, that is, individual gait cycles. One empirical approach to stride segmentation is based on timestamps. However, timestamping is a manual technique that requires a timing device and a fixed laboratory set-up, which usually restricts its applicability outside the laboratory. In this study, we propose an automatic technique for stride segmentation of accelerometry data for three different walking activities. The autocorrelation function (ACF) is utilized to identify stride boundaries, and stride-specific data are identified and extracted through a tuning parameter ($t_{p}$) based on the minimum standard deviation ($\sigma$). Rigorous experimentation is performed on the human activities and postural transitions (HAPT) and the Osaka University – Institute of Scientific and Industrial Research (OU-ISIR) gait inertial sensor datasets. The obtained mean stride durations for level walking, walking upstairs, and walking downstairs are 1.10, 1.19, and 1.02 s, with 95% confidence intervals [1.08, 1.12], [1.15, 1.22], and [0.97, 1.07], respectively, which is on par with standard findings reported in the literature. Limitations of accelerometry and the ACF are also discussed.

Keywords: stride segmentation; human activity recognition; accelerometry; gait parameter estimation; gait cycle; inertial measurement unit; autocorrelation function; wearable sensors; IoT; edge computing; tinyML.
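The core idea of ACF-based stride segmentation can be sketched in a few lines: a periodic walking signal produces an ACF peak at a lag equal to the stride duration, and the signal can then be cut into stride-length windows at that lag. The function names, the plausible stride-duration window (0.6–2.0 s), and the synthetic signal are assumptions for illustration; the paper's $t_{p}$/minimum-$\sigma$ refinement step is not reproduced here.

```python
import numpy as np

def autocorrelation(signal):
    """Normalized autocorrelation function (ACF) of a 1-D signal."""
    x = signal - np.mean(signal)
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]
    return acf / acf[0]

def estimate_stride_samples(accel, fs, min_stride_s=0.6, max_stride_s=2.0):
    """Estimate stride duration in samples as the lag of the dominant
    ACF peak inside a plausible stride-duration window (an assumed
    search range, not a value from the paper)."""
    acf = autocorrelation(accel)
    lo, hi = int(min_stride_s * fs), int(max_stride_s * fs)
    return lo + int(np.argmax(acf[lo:hi]))

def segment_strides(accel, stride_len):
    """Cut the signal into consecutive stride-length windows."""
    n = len(accel) // stride_len
    return [accel[i * stride_len:(i + 1) * stride_len] for i in range(n)]

# Synthetic check: a 100 Hz signal with a 1.1 s period (110 samples),
# mimicking the mean level-walking stride duration reported above.
fs = 100
t = np.arange(0, 10, 1 / fs)
accel = np.sin(2 * np.pi * t / 1.1)
stride_len = estimate_stride_samples(accel, fs)  # ≈ 110 samples
strides = segment_strides(accel, stride_len)
```

Real accelerometry is far noisier than this sine wave, which is why the paper adds a tuning-parameter step on top of the raw ACF peak; the sketch only shows the periodicity-detection backbone.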