2007 IEEE/RSJ International Conference on Intelligent Robots and Systems
DOI: 10.1109/iros.2007.4399384

Mobile robot outdoor localization using planar beacons and visual improved odometry

Abstract: This paper presents experimental results on the localization of a mobile robot equipped with frequent relative and infrequent absolute sensors. Two relative sensors are used: wheel-based odometry and vision-based odometry. The absolute sensor is a vision-based landmark detector that computes the pose of the robot relative to a pre-mapped visual beacon. This would be a simple sensor fusion problem, which could be solved using standard recursive estimators, had we not considered two extra chara…
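The fusion problem the abstract describes — frequent relative increments corrected by occasional absolute fixes — can be illustrated with a standard recursive estimator. The sketch below is not the paper's algorithm; it is a minimal 1-D Kalman filter with hypothetical noise values, showing how odometry steps grow the pose uncertainty and an infrequent beacon fix shrinks it.

```python
# Hedged sketch (not the paper's method): 1-D Kalman fusion of a frequent
# relative sensor (odometry increments) with an infrequent absolute sensor
# (a beacon-based position fix). All noise variances are illustrative.

def predict(x, P, dx, q):
    """Propagate the pose with a relative odometry increment dx (variance q)."""
    return x + dx, P + q

def update(x, P, z, r):
    """Correct the pose with an absolute measurement z (variance r)."""
    k = P / (P + r)                    # Kalman gain
    return x + k * (z - x), (1 - k) * P

x, P = 0.0, 1.0                        # initial pose estimate and variance
# Ten frequent odometry steps of 1.0 m each (variance 0.05 per step):
for _ in range(10):
    x, P = predict(x, P, 1.0, 0.05)
# One infrequent absolute beacon fix at 9.5 m (variance 0.1):
x, P = update(x, P, 9.5, 0.1)
print(x, P)                            # pose pulled toward the fix, variance reduced
```

Note how the variance grows linearly during dead reckoning (1.0 → 1.5 after ten steps) and collapses after the single absolute update — the qualitative behavior of any relative/absolute fusion scheme.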

Cited by 5 publications (4 citation statements)
References 12 publications (16 reference statements)
“…In [18], a localization system using a particle filter and a monocular camera was presented. The Scale-Invariant Feature Transform (SIFT) signature of the images in a database was used for comparison with the present image.…”
Section: Related Work
confidence: 99%
“…The algorithm approaches near real-time. In [11], the authors combine visual odometry with wheel-based odometry to obtain a relative position. SIFT is used for landmark detection to obtain an absolute position.…”
Section: Related Work
confidence: 99%
“…For example, GPS loses coverage when the vehicle does not have a full view of the sky [9]. As other examples, if we introduce beacons, we need to structure the whole environment; a laser, in turn, needs recognizable features in the environment, and these features must be within the laser's range of action (typically 10–20 m) [10].…”
Section: Introduction
confidence: 99%