Robotic wheelchairs with built-in assistive features, such as shared control, are an emerging means of providing independent mobility to severely disabled individuals. However, patients often struggle to build a mental model of their wheelchair's behaviour under different environmental conditions. Motivated by the desire to help users bridge this gap in perception, we propose a novel augmented reality system using a Microsoft HoloLens as a head-mounted aid for wheelchair navigation. The system displays visual feedback to the wearer as a way of explaining the underlying dynamics of the wheelchair's shared controller and its predicted future states. To investigate the influence of different interface design options, we also conducted a pilot study. We evaluated the acceptance rate and learning curve of an immersive wheelchair training regime, revealing preliminary insights into the potentially beneficial and adverse nature of different augmented reality cues for assistive navigation. In particular, we demonstrate that care should be taken in the presentation of information, with effort-reducing cues for augmented information acquisition (for example, a rear-view display) being the most appreciated.
Using industrial robots to spray structures has been investigated extensively; however, interesting challenges emerge when using handheld spraying robots. In previous work we demonstrated the use of shared control of a handheld spraying robot to assist a user in a 3D spraying task. In this paper we demonstrate the use of augmented reality interfaces to increase the user's progress and task awareness. We describe our solutions to challenging calibration issues between the Microsoft HoloLens and a motion capture system, without the need for well-defined markers or careful alignment on the part of the user. Error relative to the motion capture system was shown to be 10 mm after only a 4-second calibration routine. Secondly, we outline a logical approach for visualising liquid density for an augmented reality spraying task; this system allows the user to clearly see target regions still to be completed, areas that are complete, and areas that have been overdosed. Finally, we conducted a user study to investigate the level of assistance that a handheld robot utilising shared control methods should provide during a spraying task. Using a handheld spraying robot with a moving spray head provided little benefit over simply actuating the spray nozzle for the user. Compared with manual control, the automatic modes significantly reduced the task load experienced by the user and significantly increased the quality of the spraying result, reducing the error by 33–45%.
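The three-way visualisation described above (regions still to complete, complete regions, and overdosed regions) amounts to classifying each surface cell by its simulated paint density relative to the desired density. A minimal sketch of such a classifier follows; the thresholds and function name are assumptions for illustration, not taken from the paper:

```python
import numpy as np

# Hypothetical thresholds (assumptions, not from the paper): the fraction of
# the desired paint density at which a cell counts as complete or overdosed.
COMPLETE_FRAC = 0.9
OVERDOSE_FRAC = 1.5

def classify_coverage(density, target):
    """Label each surface cell for the AR overlay.

    density: 2D array of simulated paint density per cell
    target:  2D array of desired density per cell (0 where no paint is wanted)

    Returns an integer label map:
    0 = no paint wanted, 1 = still to complete, 2 = complete, 3 = overdosed.
    """
    labels = np.zeros(density.shape, dtype=int)
    wanted = target > 0
    ratio = np.zeros_like(density, dtype=float)
    ratio[wanted] = density[wanted] / target[wanted]
    labels[wanted & (ratio < COMPLETE_FRAC)] = 1
    labels[wanted & (ratio >= COMPLETE_FRAC) & (ratio < OVERDOSE_FRAC)] = 2
    labels[wanted & (ratio >= OVERDOSE_FRAC)] = 3
    return labels
```

The label map can then be rendered as a colour overlay on the sprayed surface in the headset, updated as the spray simulation advances.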
We present a shared control method for painting 3D geometries using a handheld robot which has a single autonomously controlled degree of freedom. The user scans the robot near the desired painting location, while the single movement axis moves the spray head to achieve the required paint distribution. A simultaneous simulation of the spraying procedure provides an open-loop approximation of the current state of the painting. An online prediction of the best path for the spray nozzle actuation is calculated in a receding-horizon fashion: a map of the paint required is produced over the 2D space defined by nozzle position on the gantry and time into the future. A directed graph takes its edge weights from this paint density map, and Dijkstra's algorithm is then used to find the candidate for the most effective path. Because the approach is heavily parallelised, with the majority of the calculations taking place on a GPU, the prediction loop runs in 32.6 ms for a prediction horizon of 1 second; this is computationally efficient and outperforms a greedy algorithm. On average, the proposed method chooses a path in the top 15% of all paths as determined by exhaustive testing. This approach enables the development of real-time path planning for assisted spray painting onto complicated 3D geometries, and could be applied to assistive painting for people with disabilities, or to accurate placement of liquid when large-scale positioning of the head is too expensive.
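The planning step described above can be sketched as a shortest-path search over a grid indexed by time step and nozzle position. This is an illustrative reconstruction, not the authors' implementation: the cost transformation (maximum need minus need, so that minimising cost maximises paint delivered along the path), the per-step movement limit, and the function name are all assumptions:

```python
import heapq
import numpy as np

def best_nozzle_path(need, max_step=1):
    """Dijkstra-style search over a paint-need map.

    need:     2D array [t, x] of paint required at gantry position x,
              t steps into the prediction horizon.
    max_step: assumed limit on how many cells the nozzle can move per step.

    Edge costs are (need.max() - need[t, x]), so the shortest path is the
    trajectory covering the most required paint. Returns one x per time step.
    """
    T, X = need.shape
    cost = need.max() - need          # non-negative, as Dijkstra requires
    dist = np.full((T, X), np.inf)
    prev = np.full((T, X), -1, dtype=int)
    dist[0] = cost[0]                 # the nozzle may start anywhere
    pq = [(dist[0, x], 0, x) for x in range(X)]
    heapq.heapify(pq)
    while pq:
        d, t, x = heapq.heappop(pq)
        if d > dist[t, x] or t == T - 1:
            continue                  # stale entry, or horizon reached
        for nx in range(max(0, x - max_step), min(X, x + max_step + 1)):
            nd = d + cost[t + 1, nx]
            if nd < dist[t + 1, nx]:
                dist[t + 1, nx] = nd
                prev[t + 1, nx] = x
                heapq.heappush(pq, (nd, t + 1, nx))
    # Trace back from the cheapest endpoint at the final time step.
    x = int(np.argmin(dist[T - 1]))
    path = [x]
    for t in range(T - 1, 0, -1):
        x = int(prev[t, x])
        path.append(x)
    return path[::-1]
```

In the paper this search runs once per control cycle in a receding-horizon loop, with the need map rebuilt from the spray simulation; the heavy lifting (building the map and edge weights) is what benefits from GPU parallelisation.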