2019
DOI: 10.1016/j.robot.2018.11.019

Precise localization of the mobile wheeled robot using sensor fusion of odometry, visual artificial landmarks and inertial sensors

Cited by 27 publications (11 citation statements)
References 7 publications
“…Current odometer designs have mostly been applied to wheeled robots, whose mileage data are obtained by measuring wheel rotation with built-in encoders to provide accurate positioning information. However, a wheel odometer can spin idly on smooth ground and cannot measure wheel "skidding", which adversely affects positioning accuracy [9]. In contrast to wheeled robots, legged robots are driven by motorized leg joints and cannot provide accurate mileage data from their own structure alone.…”
Section: Introduction
mentioning confidence: 99%
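The excerpt above describes encoder-based wheel odometry and its vulnerability to slip. Purely as an illustration (not taken from the cited paper), the sketch below shows how a differential-drive pose estimate is typically advanced from encoder ticks; the wheel radius, wheel base, and encoder resolution are assumed values.

```python
import math

# Assumed parameters for illustration; not from the cited paper
WHEEL_RADIUS_M = 0.05    # wheel radius (m)
WHEEL_BASE_M = 0.30      # distance between drive wheels (m)
TICKS_PER_REV = 1024     # encoder ticks per wheel revolution

def odometry_step(x, y, theta, d_ticks_left, d_ticks_right):
    """Advance a differential-drive pose by one encoder sample.

    Wheel slip ("skidding") is invisible to this model, which is why
    encoder-only dead reckoning degrades on smooth ground.
    """
    # Arc length travelled by each wheel since the last sample
    d_left = 2.0 * math.pi * WHEEL_RADIUS_M * d_ticks_left / TICKS_PER_REV
    d_right = 2.0 * math.pi * WHEEL_RADIUS_M * d_ticks_right / TICKS_PER_REV

    d_center = (d_left + d_right) / 2.0           # forward displacement
    d_theta = (d_right - d_left) / WHEEL_BASE_M   # heading change

    # Integrate using the mid-point heading, then wrap theta to (-pi, pi]
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta = (theta + d_theta + math.pi) % (2.0 * math.pi) - math.pi
    return x, y, theta
```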
“…To perform these complex open-field tasks effectively, accurate and precise localization of these agricultural robots, including XYZ position, heading, and attitude (roll and pitch tilts), is essential. Localization sensing principles for agricultural robots include real-time kinematic global navigation satellite systems (RTK GNSS) [1]-[5], landmark detection [6], light detection and ranging (LiDAR) [7]-[8], radar [9], sensor fusion with cameras [10]-[13], ultrasonic sensing [14], multi-sensor fusion [15]-[16], beacon-based methods [17], simultaneous localization and mapping (SLAM) [18], laser range finders (LRF) [19], and inertial measurement units (IMU) [20]. In particular, RTK GNSS coupled with inertial navigation systems (INS) is extensively employed by farmers because of its relatively higher accuracy and precision compared to other sensing techniques.…”
Section: Introduction
mentioning confidence: 99%
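This excerpt singles out RTK GNSS with INS for open-field localization. As a hedged sketch (not from the cited work), the snippet below projects a GNSS fix into local east/north metres around an assumed field origin using an equirectangular approximation; a full ENU or UTM conversion is what a production system would normally use.

```python
import math

EARTH_RADIUS_M = 6378137.0  # WGS-84 equatorial radius (m)

def geodetic_to_local(lat_deg, lon_deg, lat0_deg, lon0_deg):
    """Project a GNSS fix to planar east/north metres around an assumed
    field origin (lat0, lon0). Equirectangular approximation, adequate
    for field-scale distances only.
    """
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    lat0, lon0 = math.radians(lat0_deg), math.radians(lon0_deg)
    east = EARTH_RADIUS_M * (lon - lon0) * math.cos(lat0)
    north = EARTH_RADIUS_M * (lat - lat0)
    return east, north
```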
“…For example, using GPS along with visual and laser-based sensors is one of the traditional sensor fusion approaches for locating an outdoor vehicle [6]. Some nonlinear approaches have been proposed to attain proper sensor fusion and data reconstruction [7]. Kolanowski et al.…”
Section: Introduction
mentioning confidence: 99%
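The excerpts above frame the cited paper as a sensor-fusion approach combining odometry, visual landmarks, and inertial sensors. Purely as a sketch of the general idea, and not the authors' actual filter, the following one-dimensional Kalman-style fusion blends a drifting odometry prediction with an occasional absolute fix such as a detected landmark or GPS position; all noise values are assumptions.

```python
def predict(x, p, u, q):
    """Propagate the estimate with an odometry increment u (process noise q)."""
    return x + u, p + q

def update(x, p, z, r):
    """Correct with an absolute position measurement z (measurement noise r)."""
    k = p / (p + r)                      # Kalman gain
    return x + k * (z - x), (1.0 - k) * p

# Toy run: odometry under-reports motion; an absolute fix every 5 steps
# (e.g. a recognized landmark) pulls the estimate back toward the truth.
x, p = 0.0, 1.0
for step in range(1, 11):
    x, p = predict(x, p, u=0.95, q=0.02)   # commanded 1.0 m per step, slight slip
    if step % 5 == 0:
        x, p = update(x, p, z=float(step), r=0.25)
print(f"final estimate: {x:.2f} m, variance: {p:.3f}")
```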