Abstract-This paper presents a visual-servoing method that is based on 2-D ultrasound (US) images. The main goal is to guide a robot actuating a 2-D US probe in order to reach a desired cross-section image of an object of interest. The method we propose allows the control of both in-plane and out-of-plane probe motions. Its feedback visual features are combinations of moments extracted from the observed image. The exact analytical form of the interaction matrix that relates the time variation of the image moments to the probe velocity is developed, and six independent visual features are proposed to control the six degrees of freedom of the robot. In order to endow the system with the capability of automatically interacting with objects of unknown shape, a model-free visual servoing is developed. For that, we propose an efficient online estimation method to identify the parameters involved in the interaction matrix. Results obtained in both simulations and experiments validate the methods presented in this paper and show their robustness to different errors and perturbations, especially those inherent to the noisy US images.
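The control principle described above, driving moment-based features s toward desired values s* through the interaction matrix L, follows the classical image-based visual-servoing law v = -lambda L+ (s - s*). A minimal sketch (the function name and gain value are illustrative, not from the paper):

```python
import numpy as np

def ibvs_velocity(L, s, s_star, lam=0.5):
    """Classical image-based visual-servoing law: v = -lam * pinv(L) @ (s - s*).

    L      : (k x 6) interaction matrix relating feature rates to probe velocity
    s      : (k,) current visual features (e.g. combinations of image moments)
    s_star : (k,) desired features at the target cross-section
    Returns the 6-DOF velocity screw to send to the robot.
    """
    error = s - s_star
    return -lam * np.linalg.pinv(L) @ error
```

With six independent features, L is square and the pseudo-inverse reduces to a plain inverse when L is non-singular.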
In this paper, we propose a nonlinear controller that stabilizes unmanned aerial vehicles in GPS-denied environments with respect to visual targets by using only onboard sensing. The translational velocity of the vehicle is estimated online with a nonlinear observer, which exploits spherical visual features as the main source of information. With the proposed solution, only four visual features have been shown to be sufficient for the observer to operate in a real scenario. In addition, the observer is computationally light with constant numerical complexity, involving small-dimension matrices. The observer output is then exploited in a nonlinear controller designed with an integral backstepping approach, thus yielding a novel robust control system. By means of Lyapunov analysis, the stability of the closed-loop system is proved. Extensive simulation and experimental tests with a quadrotor are carried out to verify the validity and robustness of the proposed approach. The control system runs fully onboard on a standard processor, and only a low-cost sensing suite is employed. Tracking of a target whose speed exceeds 2 m/s is also considered in the real-hardware experiments.
Index Terms-Image-based visual servoing, nonlinear controller, nonlinear observer, unmanned aerial vehicle (UAV), velocity estimation.
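The spherical visual features mentioned above are typically obtained by projecting normalized image points onto the unit sphere, which gives feature dynamics well suited to aerial vehicles. A toy version of that projection (the function name is illustrative; the paper's observer formulation is not reproduced here):

```python
import numpy as np

def spherical_feature(x, y):
    """Project a normalized image point (x, y) onto the unit sphere.

    The 3-D ray P = (x, y, 1) through the camera center is scaled to unit
    norm; the resulting point on the sphere is the spherical feature.
    """
    P = np.array([x, y, 1.0])
    return P / np.linalg.norm(P)
```

The optical axis maps to the sphere's pole, and every feature has unit norm by construction.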
This paper presents a new image-based servo control scheme that endows an unmanned aerial vehicle (UAV) equipped with a robotic arm with the capability of automatically positioning elements on target objects. Through a new formalism, the proposed visual servo scheme controls both the UAV and the manipulator simultaneously. It takes into account the whole system redundancy as well as the peculiarity of under-actuation related to rotary-wing aircraft. While it controls the system at the velocity level, it makes use of the mobility afforded by the UAV and the dexterity inherent to robot manipulators. The case of large initial errors is explicitly addressed. Results of simulations are reported to verify the effectiveness of the proposed approach.
Abstract-A new visual servoing method based on B-mode ultrasound images is proposed to automatically control the motion of a 2D ultrasound probe held by a medical robot in order to reach a desired B-scan image of an object of interest. In this approach, combinations of image moments extracted from the current observed object cross-section are used as feedback visual features. The analytical form of the interaction matrix, relating the time variation of these visual features to the probe velocity, is derived and used in the control law. Simulations performed with a static ultrasound volume containing an egg-shaped object, and in-vitro experiments using a robotized ultrasound probe that interacts with a rabbit heart immersed in water, show the validity of this new approach and its robustness with respect to modeling and measurement errors.
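The image moments used as feedback features in the abstract above are low-order statistics of the segmented object cross-section. A minimal sketch of how such moments could be extracted from a binary B-scan mask (the function name and the particular moment set are illustrative):

```python
import numpy as np

def section_moments(mask):
    """Raw and centered moments of a binary object cross-section.

    mask : 2-D 0/1 (or boolean) array, the segmented object in the B-scan.
    Returns the area m00, the centroid (xg, yg), and the centered second
    moments mu20, mu02, mu11 -- the kind of quantities that are combined
    into feedback features.
    """
    ys, xs = np.nonzero(mask)
    m00 = xs.size                       # area (pixel count)
    xg, yg = xs.mean(), ys.mean()       # centroid
    mu20 = ((xs - xg) ** 2).sum()       # centered second moments
    mu02 = ((ys - yg) ** 2).sum()
    mu11 = ((xs - xg) * (ys - yg)).sum()
    return m00, xg, yg, mu20, mu02, mu11
```

The second moments encode the section's elongation and orientation, which is why combinations of them can constrain out-of-plane probe motion.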
Manipulation tasks carried out with aerial platforms composed of a UAV and a robotic arm involve cross-coupled dynamics between these subsystems. This paper proposes a new controller for this class of aerial robotic systems that allows regulation based on velocity commands generated by an outer image-based visual-servo scheme. The controller, which considers the full dynamics of the system, is designed based on the integral backstepping approach. Visual feedback provided by an onboard camera is employed in a new visual servo scheme to simultaneously generate velocity commands for the UAV and the manipulator so that a visual target is reached. The control system takes into account the under-actuation related to rotary-wing vehicles, while at the same time it exploits the system redundancy to achieve the task. Simulation results validate the proposed control system, as well as its robustness to large modeling errors and measurement noise.
This paper proposes a new visual servo control scheme that endows flying manipulators with the capability of positioning with respect to visual targets. A camera attached to the UAV provides real-time images of the scene. We consider the approaching part of an aerial assembling task, where the manipulator carries a structure to be plugged into the visual target. In order to augment the system capabilities regarding the 3D interaction with the target, we propose to use image moments. The developed controller generates desired velocities for both the UAV and the manipulator, simultaneously. While taking into account the under-actuation specific to rotary-wing vehicles, it makes use of the system redundancy to realize potential sub-tasks. Joint-limit avoidance is also guaranteed. The presented developments are validated by means of computer simulations.
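Exploiting redundancy for sub-tasks such as joint-limit avoidance is commonly done at the velocity level by projecting a secondary objective into the null space of the primary task Jacobian. A minimal sketch under that assumption (names and gains are illustrative; the paper's formalism also handles the UAV under-actuation, which is omitted here):

```python
import numpy as np

def redundant_velocity(J, e, q, q_min, q_max, lam=1.0, k0=0.1):
    """Velocity command with null-space joint-limit avoidance.

    Primary task: drive the visual error e to zero through the task
    Jacobian J. Secondary task: push joints toward the middle of their
    range, projected into the null space of J so the primary task is
    unaffected.
    """
    J_pinv = np.linalg.pinv(J)
    # Gradient of a distance-to-mid-range criterion (secondary objective).
    q_mid = 0.5 * (q_min + q_max)
    dq0 = -k0 * (q - q_mid) / (q_max - q_min) ** 2
    N = np.eye(J.shape[1]) - J_pinv @ J   # null-space projector of J
    return -lam * J_pinv @ e + N @ dq0
```

Because N projects onto the null space of J, the secondary term produces no motion in the image, so the visual task converges as if the secondary task were absent.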
In this paper, a method for the online absolute-scale velocity estimation of a system composed of a single camera and an inertial measurement unit is presented. The proposed formulation uses spherical image measurements acquired from at least three camera positions, together with inertial measurements, to estimate the system velocity while also solving the absolute-scale problem. A new multi-rate formulation based on sliding least-squares estimation is proposed, which is capable of providing the velocity estimate even in cases of constant or zero velocity. The effectiveness of the proposed approach is shown through extensive simulations.
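The sliding least-squares idea can be illustrated in one dimension: over a short window of position samples, fit p(t) = p0 + v t and take the slope as the velocity estimate. This toy stand-in (the function name is illustrative, and the paper's multi-rate camera/IMU formulation is much richer) shows why the estimate stays well defined for constant or zero velocity:

```python
import numpy as np

def sliding_ls_velocity(times, positions):
    """Least-squares velocity estimate over a sliding window of samples.

    Fits the linear model p(t) = p0 + v * t to the windowed samples and
    returns the slope v as the velocity estimate.
    """
    A = np.vstack([np.ones_like(times), times]).T  # design matrix [1, t]
    (p0, v), *_ = np.linalg.lstsq(A, positions, rcond=None)
    return v
```

The fit only requires the window's time stamps to be distinct, so a constant (including zero) position trajectory simply yields v = 0 rather than a degenerate estimate.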