The development of natural interfaces for human-robot interaction provides the user with an intuitive way to control and guide robots. In this paper, we propose a novel ROS (Robot Operating System)-integrated interface for remote control that allows the user to teleoperate the robot using hand motions. The user can adjust the robot's level of autonomy online between two modes: direct control and waypoint following. The hand-tracking and gesture-recognition capabilities of the Leap Motion device are exploited to generate the control commands. The user receives real-time 3D augmented visual feedback through a Kinect sensor and an HMD. To assess the practicability of the system, experimental results are presented using the remote control of a KUKA youBot as a benchmark.
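As a rough illustration of the direct-control mode described above, the sketch below maps a tracked hand position to a clamped planar velocity command. All names, gains, and limits are illustrative assumptions, not the authors' implementation; a real system would publish the result as a ROS velocity message.

```python
# Hypothetical sketch: mapping a Leap Motion hand position to a planar
# velocity command for direct teleoperation of a mobile base.
# Gains, deadzone, and velocity limit are illustrative assumptions.

def hand_to_velocity(hand_pos, home_pos, gain=2.0, deadzone=0.02, v_max=0.5):
    """Map a hand position (metres, relative to the sensor) to a
    clamped (vx, vy) velocity command; home_pos is the rest pose."""
    def shape(displacement):
        if abs(displacement) < deadzone:      # ignore small tremors near rest
            return 0.0
        v = gain * displacement
        return max(-v_max, min(v_max, v))     # saturate to the robot's limit
    return (shape(hand_pos[0] - home_pos[0]),
            shape(hand_pos[1] - home_pos[1]))
```

A deadzone around the rest pose keeps involuntary hand tremor from producing motion, and saturation keeps commands inside the robot's velocity envelope.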
This work investigates the use of a highly immersive telepresence system for industrial robotics. A Robot Operating System-integrated framework is presented in which a remote robot is controlled through the operator's movements and muscle contractions, captured with a wearable device. Augmented 3D visual feedback is sent to the user, providing the remote environment from the robot's point of view together with additional information pertaining to the task execution. Using a robot-mounted RGB-D camera, the proposed system identifies known objects and relates their poses to the robot arm pose and to targets relevant to the task execution. The system is preliminarily validated on a pick-and-place task using a Baxter robot. The experiment shows the practicability and effectiveness of the proposed approach.
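Relating an object pose detected in the robot-mounted camera's frame to the robot arm typically amounts to composing homogeneous transforms. The minimal sketch below illustrates that idea under assumed transforms; the specific frames, mounting offsets, and identity rotations are illustrative, not taken from the paper.

```python
# Hypothetical sketch: expressing an object pose detected in the
# camera frame in the robot base frame by composing 4x4 homogeneous
# transforms. Frame names and offsets are illustrative assumptions.

def matmul4(A, B):
    """Multiply two 4x4 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def make_transform(t):
    """4x4 transform with identity rotation and translation t = (x, y, z)."""
    return [[1.0, 0.0, 0.0, t[0]],
            [0.0, 1.0, 0.0, t[1]],
            [0.0, 0.0, 1.0, t[2]],
            [0.0, 0.0, 0.0, 1.0]]

# camera assumed mounted 0.5 m above the base, axes aligned with it
T_base_cam = make_transform((0.0, 0.0, 0.5))
# object detected 0.3 m in front of the camera
T_cam_obj = make_transform((0.3, 0.0, 0.0))

# object pose in the base frame, usable by the arm planner
T_base_obj = matmul4(T_base_cam, T_cam_obj)
```

In a ROS deployment the same composition would usually be delegated to the tf2 transform library rather than done by hand.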
The shortage of physicians afflicting developed countries encourages engineers and doctors to collaborate on the development of telemedicine. In particular, robotic systems have the potential to help doctors perform examinations. A very common examination that can be the goal of a robotic system is palpation. Most of the robotic systems that have been developed for palpation offer interesting features, such as integrating augmented-reality environments or allowing hands-free interaction. In this paper we present a novel palpation system that allows us to perform virtual palpation of real objects by means of haptic and augmented-reality feedback. The system features an encountered-type haptic interface in which the haptic feedback is computed by a collision-detection algorithm based on online recording of the surface to be touched. The system allows users to remove their hand from the haptic interface's end-effector, which follows the user's hand thanks to tracking performed by a Leap Motion. We show that the system provides natural interaction during the contact/non-contact switch and a suitable force during indentation, and that it allows objects within the body to be discriminated through the haptic channel.