1989
DOI: 10.1109/70.88086

Vision-guided servoing with feature-based trajectory generation (for robots)

Abstract: This paper presents a vision module which is able to guide an eye-in-hand robot through general servoing and tracking problems using off-the-shelf image processing equipment. The vision module uses the location of binary image features from a camera on the robot's end-effector to control the position and one degree of orientation of the robot manipulator. A unique feature-based trajectory generator provides smooth motion between the actual image features and the desired image features even with asynchr…
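
The feature-based trajectory generator described in the abstract can be pictured as interpolation carried out directly in image-feature space. The sketch below is a minimal, hypothetical illustration of that idea rather than the paper's actual algorithm: it applies a quintic time-scaling to blend the current feature vector into the desired one so the commanded features move smoothly instead of in a single jump.

```python
import numpy as np

def feature_trajectory(f_current, f_desired, n_steps):
    """Smooth feature-space trajectory from the current to the desired
    image-feature vector (illustrative sketch only, not the paper's method).

    f_current, f_desired : (m,) image-feature coordinates, e.g. in pixels
    n_steps              : number of intermediate set-points to emit
    """
    f_current = np.asarray(f_current, dtype=float)
    f_desired = np.asarray(f_desired, dtype=float)
    for k in range(1, n_steps + 1):
        s = k / n_steps
        # Quintic time-scaling: zero velocity and acceleration at both ends,
        # so the interpolated features start and stop smoothly.
        blend = 10 * s**3 - 15 * s**4 + 6 * s**5
        yield (1.0 - blend) * f_current + blend * f_desired

# Example: drive four scalar feature coordinates toward their goal values.
for setpoint in feature_trajectory([10, 20, 30, 40], [15, 25, 28, 44], n_steps=5):
    print(setpoint)
```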


Cited by 281 publications (97 citation statements)
References 14 publications
“…We used it to guide the HP3JC. There are two main approaches to visual servoing: Position Based Visual Servo (PBVS) [40][41][42] and Image Based Visual Servo (IBVS) [43][44][45][46][47][48]. Generally speaking, PBVS uses visual data to reconstruct the 3D world and allows researchers to design control algorithms in Cartesian space.…”
Section: System Description
confidence: 99%
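
The PBVS/IBVS distinction drawn in this statement comes down to where the control error is formed. The sketch below uses the standard textbook control laws, not the method of any specific cited work: IBVS forms the error in image space and maps it through the pseudo-inverse of an interaction (image Jacobian) matrix, while PBVS forms the error on a reconstructed Cartesian pose.

```python
import numpy as np

def ibvs_velocity(s, s_star, L, gain=0.5):
    """Image-Based Visual Servo: the error lives in image space; the camera
    velocity comes from the pseudo-inverse of the interaction matrix L."""
    error = np.asarray(s, float) - np.asarray(s_star, float)
    return -gain * np.linalg.pinv(L) @ error

def pbvs_velocity(t, t_star, rot_err_axis_angle, gain=0.5):
    """Position-Based Visual Servo: visual data are first used to reconstruct
    a Cartesian pose, and the error is a 3-D translation plus an axis-angle
    rotation, so the control law is designed directly in Cartesian space."""
    v = -gain * (np.asarray(t, float) - np.asarray(t_star, float))
    omega = -gain * np.asarray(rot_err_axis_angle, float)
    return np.concatenate([v, omega])

# Toy IBVS example with a made-up 2x2 interaction matrix for one point feature.
L = np.array([[-1.0, 0.0], [0.0, -1.0]])
print(ibvs_velocity([100.0, 80.0], [120.0, 90.0], L))
```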
“…Most experimental systems reported in the literature operate at relatively low sample rates, and have low resolution or small field of view (FOV). An explicit trajectory synthesis algorithm is often employed due to the low visual sample rates (Feddema and Mitchell 1989). Many of these systems have to use low control gains for stable and low speed closed-loop motion.…”
Section: Nomenclature
confidence: 99%
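
The explicit trajectory synthesis mentioned in this statement compensates for a visual sample rate that is much slower than the joint-servo rate: rather than commanding one large step per image, intermediate set-points are generated at the servo rate between consecutive visual measurements. A minimal sketch of that idea, assuming simple linear interpolation (the cited systems may use more elaborate motion profiles):

```python
def servo_rate_setpoints(prev_feature, new_feature, vision_dt, servo_dt):
    """Interpolate set-points at the fast servo period between two feature
    measurements taken at the slow vision period (illustrative sketch)."""
    n = max(1, int(round(vision_dt / servo_dt)))
    for k in range(1, n + 1):
        alpha = k / n
        yield [(1 - alpha) * p + alpha * q
               for p, q in zip(prev_feature, new_feature)]

# e.g. 50 ms between images and a 5 ms servo period -> 10 interpolated set-points
for sp in servo_rate_setpoints([0.0, 0.0], [10.0, 4.0], vision_dt=0.05, servo_dt=0.005):
    print(sp)
```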
“…Notice that visual servo structures require the direct access to the robot operating system in order to govern the joint variables directly, whereas dynamic look and move structures consider the robot as an independent system and can handle it as a black box. There has been a significant amount of research activity on image based control methods (Weiss et al, 1987), (Feddema & Mitchell, 1989), (Chaumette et al, 1991), (Espiau et al, 1992), (Khosla et al, 1993), (Corke, 1993) whereas there have been only a few researchers working on position based control methods (Koivo & Houshangi, 1991), (Allen et al, 1993), (Wilson et al, 1996), (Vargas et al, 1999). This tendency can be justified because image based systems usually reduce computation delays and eliminate errors due to sensor modeling and camera calibration processes.…”
Section: Overview of Visual Control Architectures
confidence: 99%
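
The architectural split described in this last statement can be illustrated with one cycle of a dynamic look-and-move loop. The names `_FakeRobot`, `_FakeCamera`, and `estimate_pose_error` below are hypothetical placeholders rather than an API from any cited system; the point is only that the vision loop issues Cartesian set-point corrections and leaves the joint-level control inside the robot controller.

```python
import numpy as np

class _FakeCamera:
    """Stand-in camera so the sketch runs without hardware (hypothetical)."""
    def grab(self):
        return "image"

class _FakeRobot:
    """Stand-in robot treated as a black box that accepts Cartesian
    set-point increments (hypothetical)."""
    def move_cartesian_relative(self, delta):
        print("commanded Cartesian increment:", delta)

def dynamic_look_and_move_step(robot, camera, estimate_pose_error, gain=0.2):
    """One cycle of a dynamic look-and-move structure: vision runs in an
    outer loop and never governs the joint variables directly."""
    image = camera.grab()                        # outer (slower) vision loop
    pose_error = estimate_pose_error(image)      # 6-vector: translation + rotation error
    correction = -gain * np.asarray(pose_error)  # proportional outer-loop law
    robot.move_cartesian_relative(correction)    # joint servoing stays inside the robot controller

# Toy run with a constant, made-up pose error.
dynamic_look_and_move_step(_FakeRobot(), _FakeCamera(),
                           lambda img: [0.02, -0.01, 0.0, 0.0, 0.0, 0.05])
```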