Motor or visual impairments may prevent a user from steering a wheelchair effectively in indoor environments. In such cases, joystick jerks arising from uncontrolled motions can lead to collisions with obstacles. Here we propose a perceptive shared control system that progressively corrects the trajectory as the user manually drives the wheelchair, by means of a sensor-based shared control law capable of smoothly avoiding obstacles. This control law is built on a low-complexity optimization framework validated through simulations and extensive clinical trials. Since the underlying model relies only on distance information, low-cost ultrasonic sensors are used to measure the distances around the wheelchair. The solution thus provides an efficient assistive tool that does not degrade the quality of experience perceived by the user, while ensuring their safety in hazardous situations.
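The idea of a progressive, distance-driven correction of the user's command can be sketched as follows. This is a minimal illustrative example, not the paper's actual control law: the function name, the linear-scaling rule and the thresholds `d_safe` and `d_stop` are all assumptions.

```python
def shared_control(user_cmd, distances, d_safe=0.5, d_stop=0.15):
    """Progressively correct a user velocity command using range readings.

    user_cmd  : (v, w) linear and angular velocity requested by the user
    distances : list of range readings (metres), e.g. from ultrasonic sensors
    d_safe    : distance below which the correction begins
    d_stop    : distance at which forward motion is fully inhibited
    """
    v, w = user_cmd
    d_min = min(distances)
    if d_min >= d_safe:
        return v, w  # free space: pass the user's command through unchanged
    # Smoothly scale the linear velocity toward zero as the obstacle nears,
    # so the assistance blends in progressively rather than taking over abruptly
    alpha = max(0.0, (d_min - d_stop) / (d_safe - d_stop))
    return alpha * v, w
```

A real implementation would optimize the corrected command subject to sensor constraints rather than apply a fixed scaling, but the sketch captures the shared-control principle: the user's input is never replaced, only attenuated near obstacles.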
This paper presents a new algorithm for the efficient computation of morphological operations on gray-scale images, together with dedicated hardware for it. The method is based on a new recursive morphological decomposition of 8-convex structuring elements into causal two-pixel structuring elements (2PSE) only. Whatever the element size, erosion and/or dilation can then be performed during a single raster-like image scan involving a fixed, reduced analysis neighborhood. The resulting process offers low computational complexity combined with an easy description of the element shape. The dedicated hardware is generic and fully regular, built from elementary interconnected stages. It has been synthesized into an FPGA and achieves high operating frequencies for any shape and size of structuring element.
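The decomposition principle can be illustrated in one dimension: erosion by a flat segment of length n is obtained by chaining n-1 two-pixel erosions, so the per-pixel work is independent of how the overall element is described. This is a software sketch of the 1-D analogue only, with border handling by clamping as an assumption; it is not the paper's 2-D decomposition or its hardware mapping.

```python
def erode_2pse(signal):
    """One causal two-pixel erosion: out[i] = min(s[i], s[i+1]).
    The final sample is handled by clamping at the border."""
    n = len(signal)
    return [min(signal[i], signal[min(i + 1, n - 1)]) for i in range(n)]

def erode_segment(signal, length):
    """Erosion by a flat segment of `length` pixels, built by chaining
    length-1 two-pixel erosions (the recursive decomposition idea)."""
    out = list(signal)
    for _ in range(length - 1):
        out = erode_2pse(out)
    return out
```

Dilation follows by replacing `min` with `max`; the fixed two-pixel neighborhood is what makes a regular, stage-based hardware pipeline possible.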
Navigating within an unknown indoor environment using an electric wheelchair is a challenging task, especially if the user suffers from severe disabilities. In order to reduce fatigue and increase autonomy, control architectures have to be designed that assist users in wheelchair navigation. We present a framework for vision-based autonomous indoor navigation of an electric wheelchair capable of following corridors and passing through open doorways using a single doorpost. Visual features extracted from cameras on board the wheelchair are used as inputs for image-based controllers embedded in the wheelchair. It has to be noted that no a priori information is used, except for the assumptions that the wheelchair moves in a typical indoor environment and that the system is coarsely calibrated. The designed control schemes have been implemented on a robotized wheelchair, and experimental results show the robust behaviour of the designed system.
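A common way to build such an image-based corridor-following controller is to servo on the corridor's vanishing point and the orientation of its median line. The sketch below is a generic illustration of that idea under assumed feature names, gains and sign conventions; it is not the specific control scheme of the paper.

```python
def corridor_controller(x_vp_px, theta_med, image_width=640,
                        k_vp=1.0, k_th=0.5, v_forward=0.3):
    """Minimal image-based corridor-following law.

    x_vp_px   : horizontal pixel position of the corridor vanishing point
    theta_med : orientation error of the corridor median line (radians)
    Returns (v, w): forward and angular velocity commands.
    Gains, image size and the proportional structure are assumptions.
    """
    # Normalise the vanishing-point error to [-1, 1] about the image centre
    x_err = (x_vp_px - image_width / 2) / (image_width / 2)
    # Steer so the vanishing point recentres and the median line straightens
    w = -(k_vp * x_err + k_th * theta_med)
    return v_forward, w
```

With the vanishing point centred and the median line vertical, the wheelchair drives straight; any lateral drift of the features produces a proportional steering correction.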
Needle insertion procedures can greatly benefit from robotic systems to improve their accuracy and success rate. However, a fully automated system is usually not desirable, and the clinicians need to be included in the control loop. In this paper we present a teleoperation framework for bevel-tip flexible needle steering that enables the user to directly and intuitively control the trajectory of the needle tip via a haptic interface. The 6 degrees of freedom of the needle base are used to perform several automatic safety and targeting tasks in addition to the one controlled by the user. Real-time visual feedback is provided by a 3D ultrasound probe and used to track the 3D location of the needle and of a spherical target. Several haptic force feedback strategies are compared, as well as two different levels of mixing between automated and user-controlled tasks. A validation of the framework is conducted in a gelatin phantom, and a mean targeting accuracy of 2.5 mm is achieved. The results show that providing adequate haptic guidance to the user can reduce the risk of tissue damage while still leaving the surgeon in control of the tip trajectory.
People with severe disabilities often rely on power wheelchairs for moving around. However, if their driving abilities are affected by their condition, driving a power wheelchair can become very dangerous, both for themselves and for their surroundings. This paper proposes the use of wearable vibrotactile haptics for wheelchair navigation assistance. We use one or two haptic armbands, each composed of four evenly-spaced vibrotactile actuators, for providing different navigation information to power wheelchair users. With respect to other available solutions, our approach provides rich navigation information while always leaving the patient in control of the wheelchair motion. Moreover, our armbands can be easily adapted to different limbs and can be used by all those patients who are unable to safely maneuver a kinesthetic interface. The results of two human-subject studies show the viability and effectiveness of the proposed technique with respect to not providing any environmental cue. Collisions were reduced by 49% when using the vibrotactile armbands. Moreover, most subjects expressed a preference for receiving haptic feedback and found the armbands comfortable to wear and use.
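With four evenly-spaced actuators, directional cues are naturally conveyed by mapping an obstacle's bearing to the nearest actuator. The following is a hypothetical sketch of such a mapping; the sector layout, labels and sign convention are illustrative assumptions, not the paper's actual encoding.

```python
def actuator_for_bearing(bearing_deg):
    """Map an obstacle bearing (degrees, 0 = straight ahead, positive
    clockwise) to one of four evenly-spaced armband actuators.
    Each actuator covers a 90-degree sector centred on its direction."""
    sectors = ["front", "right", "back", "left"]
    # Shift by half a sector so each actuator's sector is centred on it,
    # then bucket the wrapped angle into one of four 90-degree sectors
    idx = int(((bearing_deg + 45) % 360) // 90)
    return sectors[idx]
```

Vibration intensity could additionally be modulated by obstacle distance, giving the user both direction and urgency while the wheelchair command itself remains entirely theirs.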
Autonomy and social inclusion can be everyday challenges for people with mobility impairments. These people can benefit from technical aids such as power wheelchairs to regain mobility and overcome social exclusion. However, driving a power wheelchair is a challenging task that requires good visual, cognitive and visuo-spatial abilities. Besides, a power wheelchair can cause material damage, or injure others or the user, if not operated safely. Therefore, training and repeated practice are mandatory to acquire the safe driving skills required to obtain a power wheelchair prescription from therapists. However, conventional training programs may prove insufficient for some people with severe impairments. In this context, Virtual Reality offers the opportunity to design innovative learning and training programs while providing a realistic wheelchair driving experience within a virtual environment. In line with this, we propose a user-centered design of a multisensory power wheelchair simulator. This simulator addresses classical drawbacks of virtual experiences, such as cybersickness and a reduced sense of presence, by combining 3D visual rendering, haptic feedback and motion cues. It relies on a modular and versatile workflow enabling easy interfacing not only with any virtual display, but also with any user interface such as wheelchair controllers or feedback devices. This paper presents the design of the first implementation as well as its first commissioning through pretests. The first setup achieves consistent and realistic behavior.
Some people with severe mobility impairments are unable to operate powered wheelchairs reliably and effectively using commercially available interfaces. This has sparked a body of research into "smart wheelchairs", which assist users to drive safely and create opportunities for them to use alternative interfaces. Various "shared control" techniques have been proposed to provide an appropriate level of assistance that is satisfactory and acceptable to the user. Most shared control techniques employ a traditional strategy called linear blending (LB), where the user's commands and the wheelchair's autonomous commands are combined in some proportion. In this paper, however, we implement a more generalised form of shared control called probabilistic shared control (PSC). This probabilistic formulation improves the accuracy of modelling the interaction between the user and the wheelchair by taking into account uncertainty in the interaction. We demonstrate the practical success of PSC over LB in terms of safety, particularly for novice users.
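The contrast between the two strategies can be sketched in a few lines. Linear blending is the standard weighted sum; for the probabilistic side, a toy formulation selects, among candidate commands, the one maximising an intent likelihood times a safety probability. Both functions, their names, the Gaussian user model and the candidate-set approach are illustrative assumptions, not the paper's actual PSC formulation.

```python
import math

def linear_blend(user_cmd, auto_cmd, alpha):
    """Classic linear blending of (v, w) commands.
    alpha = 1 gives full user control, alpha = 0 full autonomy."""
    (vu, wu), (va, wa) = user_cmd, auto_cmd
    return (alpha * vu + (1 - alpha) * va,
            alpha * wu + (1 - alpha) * wa)

def psc_select(user_cmd, candidates, safety, sigma=0.5):
    """Toy probabilistic shared control: pick the candidate command
    maximising P(candidate | user intent) * P(safe | candidate), with
    an isotropic Gaussian user model centred on the user's input.
    `safety` maps a candidate to its estimated collision-free probability."""
    def score(c):
        d2 = (c[0] - user_cmd[0]) ** 2 + (c[1] - user_cmd[1]) ** 2
        return math.exp(-d2 / (2 * sigma ** 2)) * safety(c)
    return max(candidates, key=score)
```

The key difference illustrated here is that PSC reasons over a distribution of plausible commands rather than averaging two fixed ones, so a slightly different but much safer command can win even when it deviates from the raw joystick input.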