2009 IEEE International Conference on Robotics and Automation
DOI: 10.1109/robot.2009.5152482

Navigation through urban environments by visual perception and interaction

Abstract: In the Autonomous City Explorer (ACE) project, a mobile robot is developed that is capable of finding its way to a given destination in an unknown urban environment. An exemplary mission is to find the way from our institute to the Marienplatz, a public square in the center of Munich, without any prior knowledge or GPS information. Inspired by the behavior of humans in unknown environments, ACE must find its way by asking pedestrians. The route is about 1.5 kilometers long and includes heavily traveled roads an… [abstract truncated in source]

Cited by 16 publications (8 citation statements)
References 14 publications (10 reference statements)
“…Starting with the detected skin parts, the algorithm segments the point cloud into smaller clusters and fits a 28-degree-of-freedom kinematic human model to the scene, thus retrieving the body pose and the direction of the pointing gesture. For further information on the Vision module please refer to [23].…”
Section: A. The Autonomous City Explorer (ACE)
confidence: 99%
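The segmentation step described above, grouping a point cloud into smaller clusters before model fitting, can be sketched with a toy greedy Euclidean clustering routine. This is a generic illustration, not the paper's actual algorithm; the function name, the distance threshold, and the brute-force neighbor search are all assumptions for the sketch.

```python
import numpy as np

def euclidean_cluster(points, radius=0.1):
    """Greedy Euclidean clustering: grow each cluster by absorbing
    any unvisited point within `radius` of a cluster member.
    Illustrative only; the paper's segmentation method and
    parameters are not specified here."""
    points = np.asarray(points, dtype=float)
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        cluster = [seed]
        frontier = [seed]
        while frontier:
            i = frontier.pop()
            # Brute-force neighbor search; a k-d tree would be used in practice.
            near = [j for j in unvisited
                    if np.linalg.norm(points[i] - points[j]) < radius]
            for j in near:
                unvisited.remove(j)
                cluster.append(j)
                frontier.append(j)
        clusters.append(sorted(cluster))
    return clusters

# Two well-separated blobs of two points each yield two clusters.
pts = [(0, 0, 0), (0.05, 0, 0), (1, 1, 1), (1.05, 1, 1)]
print(len(euclidean_cluster(pts)))  # → 2
```

In a pipeline like the one cited, each resulting cluster would then be tested against the kinematic human model to recover body pose and pointing direction.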
“…For interaction with humans, a touchscreen and speakers are used. A robotic head, which comprises five cameras and an animated mouth, is mounted on the robot for gesture recognition and people tracking [6]. More details on the hardware components can be found in [5].…”
Section: System Description
confidence: 99%
“…The vision system of the robot is used to track the person [6]. The tracker runs at 10 Hz and its outputs are sent to the Behavior Control.…”
Section: A. Robot Behavior Description
confidence: 99%
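A tracker that "runs at 10 Hz" typically means a fixed-rate loop that performs one update per period and sleeps off the remainder before handing its output downstream. A minimal sketch of that pattern is below; `run_tracker`, `step`, and the cycle count are illustrative names, not interfaces from the cited system.

```python
import time

def run_tracker(step, rate_hz=10.0, cycles=5):
    """Fixed-rate loop: call `step()` at `rate_hz`, sleeping off the
    remainder of each period, and collect the outputs that would be
    forwarded to a consumer (e.g. a behavior-control module).
    Generic sketch; names are illustrative, not from the paper."""
    period = 1.0 / rate_hz
    outputs = []
    for _ in range(cycles):
        start = time.monotonic()
        outputs.append(step())            # one tracking update
        elapsed = time.monotonic() - start
        if elapsed < period:
            time.sleep(period - elapsed)  # hold the fixed cadence
    return outputs

# Example: a dummy tracking step that just counts frames.
frames = iter(range(100))
print(run_tracker(lambda: next(frames), cycles=3))  # → [0, 1, 2]
```

Sleeping off only the residual time (rather than a fixed `sleep(0.1)`) keeps the loop close to the target rate even when each update takes a variable amount of time.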