Proceedings of IEEE International Conference on Robotics and Automation
DOI: 10.1109/robot.1996.509257

Unifying configuration space and sensor space for vision-based motion planning

Abstract: Visual feedback can play a crucial role in a dynamic robotic task such as the interception of a moving target. To utilize the feedback effectively, there is a need to develop robot motion planning techniques that also take into account properties of the sensed data. We propose a motion planning framework that achieves this with the help of a space called the Perceptual Control Manifold (PCM), defined on the product of the robot configuration space and an image-based feature space. We show how the task of i…

Cited by 7 publications (3 citation statements)
References 11 publications
“…On the other hand, an image-based servoing process bypasses the 3D world reconstruction and uses image features directly to control robot motion [8][9][10][11][12]. Image-based servoing observes how differential changes in robot configuration space relate to differential changes in image-feature space, and then uses this derived relationship and the expected goal features to control robot motion. The disadvantage of the image-based approach is that the control goal is hard to specify with changing camera configurations.…”
Section: Introduction
confidence: 99%
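The differential relationship this statement describes is commonly captured by an image Jacobian. A minimal sketch of one image-based servoing step follows; the feature map, gain, and configurations below are illustrative assumptions, not the paper's actual camera model:

```python
import numpy as np

def image_features(q):
    # Hypothetical forward map: robot configuration -> image-feature vector.
    # Stands in for the camera/projection model of a real system.
    return np.array([np.cos(q[0]) + q[1], np.sin(q[0]) * q[1]])

def image_jacobian(q, eps=1e-6):
    # Numerically estimate how differential changes in configuration space
    # map to differential changes in image-feature space.
    f0 = image_features(q)
    J = np.zeros((f0.size, q.size))
    for i in range(q.size):
        dq = np.zeros_like(q)
        dq[i] = eps
        J[:, i] = (image_features(q + dq) - f0) / eps
    return J

def servo_step(q, goal_features, gain=0.5):
    # One image-based servoing update: drive the feature error toward zero
    # using the pseudo-inverse of the image Jacobian.
    error = image_features(q) - goal_features
    return q - gain * np.linalg.pinv(image_jacobian(q)) @ error

q = np.array([0.2, 1.0])
goal = image_features(np.array([0.8, 1.5]))
for _ in range(50):
    q = servo_step(q, goal)
```

Note that the loop regulates the error in feature space only; the robot converges to *some* configuration producing the goal features, which is exactly why the control goal is hard to specify when the camera configuration changes.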
“…There is thus a growing need for extending the configuration-space planning paradigm to bridge the gap between planning and sensing, so that motion plans can benefit by optimally utilizing the available sensing mechanism. In [2] we proposed a motion planning framework that achieves this with the help of a space called the Perceptual Control Manifold or PCM. The PCM is a manifold defined on the product of the robot configuration space and an image-based feature space, with features defined in terms of the objects in the image, as opposed to photometric features (pixel intensities, colors, etc.…”
Section: Incorporating Sensor Constraints Into
confidence: 99%
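The product-space construction can be sketched concretely: a PCM point pairs a configuration with the features it induces, and sensor constraints become conditions stated directly in feature space. The feature map and image bounds below are illustrative stand-ins, not the paper's sensor model:

```python
import numpy as np

def features(q):
    # Illustrative stand-in for the camera model mapping a configuration
    # to image-feature coordinates.
    return np.array([q[0] + 0.5 * q[1], q[0] * q[1]])

def lift_to_pcm(path):
    # Lift a configuration-space path onto the PCM: each configuration q
    # becomes the pair (q, features(q)), a point on the manifold embedded
    # in the product of C-space and feature space.
    return [(q, features(q)) for q in path]

def in_field_of_view(f, lo=-2.0, hi=2.0):
    # A sensor constraint expressed in feature space: the tracked features
    # must stay inside the (hypothetical) image bounds.
    return bool(np.all((f >= lo) & (f <= hi)))

# A straight-line C-space path, lifted and checked against the constraint.
path = [np.array([t, 1.0 - t]) for t in np.linspace(0.0, 1.0, 5)]
pcm_path = lift_to_pcm(path)
visible = all(in_field_of_view(f) for _, f in pcm_path)
```

Planning on the lifted path rather than in C-space alone is what lets sensor constraints (visibility, field of view) be enforced during planning instead of after it.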
“…Unfortunately, in most motion planning approaches, sensing is completely decoupled from planning. In [2] we present a framework for motion planning that considers sensors as an integral part of the definition of the motion goal. The approach is based on the concept of the Perceptual Control Manifold (PCM), defined on the product of the robot C-space and sensor space.…”
Section: Introduction
confidence: 99%