This paper presents a novel methodology for estimating the gait phase of human walking with a simple sensory apparatus. Three subsystems are combined: a primary phase estimator based on adaptive oscillators, a desired gait event detector, and a phase error compensator. The estimated gait phase is expected to increase linearly from 0 to 2π rad within one stride and to remain continuous when transitioning to the next stride. We designed two experimental scenarios to validate this gait phase estimator: treadmill walking at different speeds and free walking. In treadmill walking, the maximum phase error at the desired gait events was 0.155 rad, and the maximum phase difference between the end of the previous stride and the beginning of the current stride was 0.020 rad. In the free walking trials, the phase error at the desired gait event was never larger than 0.278 rad. Our algorithm outperformed two other benchmarked methods. The good performance of our gait phase estimator could provide consistent and finely tuned assistance for an exoskeleton designed to augment the mobility of patients.
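To make the adaptive-oscillator idea concrete, the following Python sketch tracks the phase of a periodic gait signal with a single adaptive (Hopf-like) oscillator. It is a minimal illustration, not the paper's estimator: the gait event detector and phase error compensator are omitted, and the gains `nu` and `eta` and the initial frequency `omega0` are assumed values.

```python
import numpy as np

def adaptive_oscillator_phase(signal, dt, nu=2.0, eta=0.5, omega0=2 * np.pi):
    """Track the phase of a periodic gait measurement (e.g., a thigh angle)
    with one adaptive oscillator. Gains are hypothetical; this sketches the
    primary phase estimator only."""
    phi, omega, alpha = 0.0, omega0, 1.0
    phase = np.empty_like(signal)
    for k, y in enumerate(signal):
        e = y - alpha * np.cos(phi)                   # tracking error
        phi += dt * (omega - nu * e * np.sin(phi))    # phase update
        omega += dt * (-eta * e * np.sin(phi))        # frequency adaptation
        alpha += dt * (eta * e * np.cos(phi))         # amplitude adaptation
        phase[k] = phi % (2 * np.pi)                  # 0 to 2*pi per stride
    return phase

# Example: lock onto a synthetic 1 Hz "gait" signal
t = np.arange(0.0, 10.0, 0.01)
phase = adaptive_oscillator_phase(np.cos(2 * np.pi * t), dt=0.01)
```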
Bioinspiration in robotics deals with applying biological principles to the design of better-performing devices. In this article, we propose a novel bioinspired framework using motor primitives for locomotion assistance through a wearable cooperative exoskeleton. In particular, the use of motor primitives for assisting different locomotion modes (i.e., ground-level walking at several cadences and ascending and descending stairs) is explored by means of two different strategies. In the first strategy, identified motor primitives are combined through weights to directly produce the desired assistive torque profiles. In the second strategy, identified motor primitives are combined to serve as neural stimulations to a virtual model of the musculoskeletal system, which, in turn, produces the desired assistive torque profiles.
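A minimal sketch of the first strategy, assuming hand-made Gaussian primitive shapes and illustrative weights (the paper identifies both from data, so every number below is a placeholder):

```python
import numpy as np

phase = np.linspace(0.0, 1.0, 200)          # normalized gait cycle
centers = [0.1, 0.35, 0.6, 0.85]            # assumed primitive timings
width = 0.08                                # assumed primitive width
# Motor primitives as Gaussian bumps over the cycle, shape (4, 200)
P = np.array([np.exp(-((phase - c) / width) ** 2) for c in centers])

# Strategy 1: the assistive torque is a weighted sum of the primitives.
w_hip = np.array([1.2, -0.4, 0.8, -0.6])    # illustrative weights [Nm]
tau_hip = w_hip @ P                         # hip torque profile, shape (200,)
```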
An emerging approach to designing locomotion assistive devices is to reproduce desirable biological principles of human locomotion. In this paper, we present a bio-inspired controller for locomotion assistive devices based on the concept of motor primitives. The weighted combination of artificial primitives results in a set of virtual muscle stimulations. These stimulations then activate a virtual musculoskeletal model that produces reference assistive torque profiles for different locomotion tasks (i.e., walking, ascending stairs, and descending stairs). The paper reports the validation of the controller through a set of experiments conducted with healthy participants. The proposed controller was tested for the first time with a unilateral leg exoskeleton assisting the hip, knee, and ankle joints by delivering a fraction of the computed reference torques. Importantly, subjects performed a track involving ground-level walking, ascending stairs, descending stairs, and several transitions between these tasks. These experiments highlighted the capability of the controller to provide relevant assistive torques and to handle transitions between the tasks effectively. Subjects displayed a natural interaction with the device. Moreover, they completed the track in significantly less time with assistance than when wearing the device with no assistance.
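As a rough sketch of the stimulation-to-torque pipeline, the snippet below passes a virtual stimulation through first-order excitation-to-activation dynamics, a standard ingredient of musculoskeletal models, and scales the activation by a constant maximum force and moment arm. The time constants, `F_max`, and `r` are textbook-style assumptions, not the paper's model, which also accounts for length- and velocity-dependent muscle force.

```python
import numpy as np

def activation_dynamics(u, dt, tau_act=0.01, tau_deact=0.04):
    """First-order excitation-to-activation dynamics (assumed time
    constants); u is the virtual neural stimulation in [0, 1]."""
    a = np.zeros_like(u)
    for k in range(1, len(u)):
        tau = tau_act if u[k] > a[k - 1] else tau_deact
        a[k] = a[k - 1] + dt * (u[k] - a[k - 1]) / tau
    return a

# Hypothetical single virtual muscle: torque = activation * F_max * moment arm
dt, F_max, r = 0.001, 1000.0, 0.05                    # [s], [N], [m]
u = np.clip(np.sin(np.linspace(0.0, np.pi, 1000)), 0.0, 1.0)
tau_ref = activation_dynamics(u, dt) * F_max * r      # reference torque [Nm]
```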
This work presents a novel controller for robotic hands that regulates the grasp stiffness by manipulating the pose and the finger joint stiffness of hands with multiple degrees of freedom while guaranteeing grasp stability. The proposed approach is inspired by observations of human motor behaviour that reveal a coordinated pattern of stiffening among the hand fingers, along with a predictive selection of the hand pose to achieve a reliable grasp. The former adjusts the magnitude of the grasp stiffness, while the latter manipulates its overall geometry (shape). Realizing a similar control approach in robotic hands can reduce software complexity and also promote a novel mechanical design approach, in which the finger stiffness profiles of the hand are adjusted by only one active component. The proposed control is validated with the fully actuated Allegro Hand by achieving pre-defined grasp stiffness profiles or modifications of an initial one.
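The relation such a controller exploits can be sketched with the standard congruence mapping between joint and Cartesian stiffness, K_x = (J K_q^{-1} J^T)^{-1}: the hand pose (through the Jacobian J) shapes the stiffness geometry, while scaling the joint stiffness K_q scales its magnitude. The Jacobian and stiffness values below are illustrative, not the Allegro Hand's.

```python
import numpy as np

def cartesian_stiffness(J, Kq):
    """Translational fingertip stiffness from joint stiffness via the
    hand Jacobian: K_x = (J Kq^-1 J^T)^-1 (geometric terms neglected)."""
    return np.linalg.inv(J @ np.linalg.inv(Kq) @ J.T)

J = np.array([[0.10, 0.07, 0.03],    # illustrative 3x3 Jacobian [m/rad]
              [0.00, 0.05, 0.04],
              [0.02, 0.00, 0.06]])
Kq = np.diag([5.0, 5.0, 5.0])        # joint stiffness [Nm/rad]

Kx_base = cartesian_stiffness(J, Kq)       # baseline geometry
Kx_stiff = cartesian_stiffness(J, 2 * Kq)  # same shape, doubled magnitude
```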
The objective of this paper is to develop and evaluate a directional vibrotactile feedback interface as a guidance tool for postural adjustments during work. In contrast to existing active and wearable systems such as exoskeletons, we aim to create a lightweight and intuitive interface capable of guiding its wearers towards more ergonomic and healthy working conditions. To achieve this, a vibrotactile device called ErgoTac is employed to develop three different feedback modalities that can provide directional guidance at the body segments towards a desired pose. In addition, an evaluation is made to find the most suitable, comfortable, and intuitive feedback modality for the user. To this end, the modalities are first compared experimentally on fifteen subjects wearing eight ErgoTac devices to achieve targeted arm and torso configurations. The most effective directional feedback modality is then evaluated on five subjects in a set of experiments in which an ergonomic optimisation module provides the optimised body posture while performing heavy lifting or forceful exertion tasks. The results yield strong evidence of the usefulness and intuitiveness of one of the developed modalities in providing guidance towards ergonomic working conditions by minimising the effect of an external load on body joints. We believe that the integration of such low-cost devices in workplaces can help address the well-known and complex problem of work-related musculoskeletal disorders.
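One plausible way to turn a posture deviation into a directional cue is sketched below: the dominant-axis error selects which motor to pulse, and its magnitude sets the intensity. This is an assumed modality for illustration; the motor layout, dead zone, and gains are not those of the paper's three ErgoTac modalities.

```python
import numpy as np

def directional_cue(current_deg, target_deg, dead_zone=5.0):
    """Pick a vibration motor and intensity from the (pitch, roll)
    deviation of a body segment; layout and gains are assumptions."""
    err = np.asarray(target_deg, float) - np.asarray(current_deg, float)
    if np.linalg.norm(err) < dead_zone:
        return None, 0.0                               # inside target zone
    axis = int(np.argmax(np.abs(err)))                 # dominant error axis
    motors = [("front", "back"), ("left", "right")]
    motor = motors[axis][0] if err[axis] > 0 else motors[axis][1]
    intensity = min(1.0, abs(err[axis]) / 45.0)        # saturate at 45 deg
    return motor, intensity

motor, level = directional_cue(current_deg=(20.0, -3.0), target_deg=(0.0, 0.0))
# -> drives the "back" motor to cue a backward pitch correction
```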
This work presents a bio-inspired grasp stiffness control for robotic hands based on the concepts of Common Mode Stiffness (CMS) and Configuration Dependent Stiffness (CDS). Using an ellipsoid representation of the desired grasp stiffness, the algorithm focuses on achieving its geometrical features. Based on preliminary knowledge of the fingers' workspace, the method starts by exploring the possible hand poses that maintain the grasp contacts on the object. This yields a first set of feasible grasp configurations, providing the basis for the CDS control. Then, an optimization is performed to find the minimum joint stiffness (CMS control) that stabilizes these grasps. This joint stiffness can be increased afterwards depending on the task requirements. The algorithm finally chooses, among all the stable configurations found, the one that best approximates the desired grasp stiffness geometry (CDS). The proposed method reduces control complexity: the joint positions are still regulated independently, but only one input is required to produce the desired joint stiffness. Moreover, using the finger poses to attain the desired grasp stiffness results in a more energy-efficient configuration than relying only on joint stiffness (i.e., joint torque) modifications. The control strategy is evaluated using the fully actuated Allegro Hand while grasping a wide variety of objects. Different desired grasp stiffness profiles are selected to exemplify several stiffness geometries.
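A minimal sketch of the CDS selection step, assuming the candidate poses are given by their fingertip Jacobians and that the CMS optimization has already produced a single common-mode stiffness value `kq_min`; the trace-normalized Frobenius distance stands in for the paper's ellipsoid-geometry criterion.

```python
import numpy as np

def geometry_error(Kx, Kx_des):
    """Compare stiffness shapes independently of magnitude by trace-
    normalizing both matrices (assumed stand-in for the ellipsoid match)."""
    return np.linalg.norm(Kx / np.trace(Kx) - Kx_des / np.trace(Kx_des))

def select_pose(jacobians, Kx_des, kq_min=1.0):
    """Among candidate grasp poses (Jacobians assumed to keep the
    contacts), pick the one whose Cartesian stiffness geometry best
    matches the desired one under Kq = kq_min * I."""
    best_J, best_err = None, np.inf
    for J in jacobians:
        Kx = kq_min * np.linalg.inv(J @ J.T)   # K_x for uniform joint stiffness
        err = geometry_error(Kx, Kx_des)
        if err < best_err:
            best_J, best_err = J, err
    return best_J

candidates = [np.diag([0.05, 0.08, 0.08]), np.diag([0.08, 0.05, 0.05])]
Kx_des = np.diag([200.0, 100.0, 100.0])        # desired geometry (stiff in x)
J_best = select_pose(candidates, Kx_des)       # -> first candidate wins
```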
In this manuscript, we present an online scalable tele-impedance framework that enables the individual and collaborative control of multiple different robotic platforms. The framework provides an intuitive low-cost interface with visual feedback and a SpaceMouse, through which the operator can define the desired task-level trajectories and impedance profiles. With a simple SpaceMouse click, the user can switch between the robots and the collaborative operation mode. The controller then manages the distribution of the required parameters to the involved robots. Thanks to the introduced virtual hand concept, in which each robot is defined as a finger, new robots can easily be added or removed via their kinodynamic parameters. The proposed framework was evaluated with three different experiments: a simulated auscultation on a mock-up patient, a cooperative task in which one robot moves the patient in a wheelchair while another robot performs the auscultation, and a collaborative task in which two robots relocate a container. The results demonstrate the capabilities of the framework in terms of adaptability to different robotic platforms, the number of robots involved, and the task requirements. Additionally, quantitative and subjective analyses of 12 subjects showed that the developed interface, even in the presence of inaccurate visual feedback, allowed smooth and accurate execution of the tasks.
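The switching behaviour can be sketched as below, with robots registered as "fingers" of a virtual hand and a SpaceMouse click cycling through the robots and then into collaborative mode. All class and method names are hypothetical, and the impedance distribution is reduced to routing one task-level twist.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class Robot:
    """Stand-in for one platform ('finger' of the virtual hand)."""
    name: str
    pose: List[float] = field(default_factory=lambda: [0.0] * 6)

class VirtualHand:
    def __init__(self):
        self.robots: Dict[str, Robot] = {}
        self.active: Optional[str] = None
        self.collaborative = False

    def add(self, robot: Robot):              # register a new 'finger'
        self.robots[robot.name] = robot
        self.active = self.active or robot.name

    def click(self):                          # SpaceMouse click cycles modes
        names = list(self.robots)
        if self.collaborative:
            self.collaborative, self.active = False, names[0]
        elif names.index(self.active) + 1 < len(names):
            self.active = names[names.index(self.active) + 1]
        else:
            self.collaborative = True         # after the last robot

    def command(self, twist: List[float]):    # route one task-level command
        targets = self.robots.values() if self.collaborative \
            else [self.robots[self.active]]
        for r in targets:
            r.pose = [p + v for p, v in zip(r.pose, twist)]

hand = VirtualHand()
hand.add(Robot("arm_A")); hand.add(Robot("arm_B"))
hand.command([0.01, 0, 0, 0, 0, 0])   # moves arm_A only
hand.click(); hand.click()            # cycle to collaborative mode
hand.command([0, 0.01, 0, 0, 0, 0])   # moves both arms together
```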