2019
DOI: 10.1007/978-3-030-27544-0_3

Visual SLAM-Based Localization and Navigation for Service Robots: The Pepper Case

Abstract: We propose a Visual-SLAM based localization and navigation system for service robots. Our system is built on top of the ORB-SLAM monocular system but extended by the inclusion of wheel odometry in the estimation procedures. As a case study, the proposed system is validated using the Pepper robot, whose short-range LIDARs and RGB-D camera do not allow the robot to self-localize in large environments. The localization system is tested in navigation tasks using Pepper in two different environments: a medium-size …
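The abstract states the key architectural choice: a monocular ORB-SLAM pipeline extended with wheel odometry in the estimation procedures. As a rough illustration of why such a fusion helps (this is not the authors' implementation, whose details are in the full text), the Python sketch below propagates the last known pose with wheel-odometry increments whenever visual tracking drops out; the class name, the planar SE(2) simplification, and the fusion rule are all assumptions made for this example.

```python
import numpy as np

def se2(x, y, theta):
    """Homogeneous 2D transform for a planar robot pose."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0,  0, 1]])

class OdometryAidedTracker:
    """Illustrative fusion only: keep the visual pose when tracking succeeds,
    otherwise propagate the last pose with wheel-odometry increments.
    This is a hypothetical interface, not the ORB-SLAM extension from the paper."""

    def __init__(self):
        self.pose = np.eye(3)        # current estimate of the robot pose in the world
        self.last_odom = np.eye(3)   # wheel-odometry pose at the previous update

    def update(self, odom_pose, visual_pose=None):
        # Relative motion measured by the wheel encoders since the last update.
        delta = np.linalg.inv(self.last_odom) @ odom_pose
        self.last_odom = odom_pose

        if visual_pose is not None:
            # Visual tracking succeeded: trust the SLAM pose.
            self.pose = visual_pose
        else:
            # Tracking lost (e.g. a feature-poor scene): dead-reckon with odometry.
            self.pose = self.pose @ delta
        return self.pose

# Usage: the tracker survives a frame where visual tracking is lost.
tracker = OdometryAidedTracker()
tracker.update(se2(0.0, 0.0, 0.0), visual_pose=se2(0.0, 0.0, 0.0))
tracker.update(se2(0.5, 0.0, 0.0), visual_pose=None)   # visual tracking lost
print(tracker.update(se2(1.0, 0.0, 0.1), visual_pose=se2(1.02, 0.01, 0.1)))
```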

Cited by 6 publications (3 citation statements)
References 15 publications
“…Alhmiedat et al [20] highlighted the problems with Pepper's navigation system. Gomez et al [21] mentioned that due to the small-range LIDARs and RGB, Pepper is not able to self-localize in larger surroundings. Silva et al [22] studied Pepper's navigation with and without obstacle avoidance and illustrated that without obstacles, its success rate is higher.…”
Section: Related Work
confidence: 99%
“…Although promising results were reported, the studies were conducted in a constrained space, suggesting additional research. Previous studies have described the improvement in Pepper's existing functionality, such as improving locomotion using control theory [31], 3D depth perception [32,33], and another using a combination of monocular perception and the in-built 3D sensor, navigation, and localization using the Robot operating system (ROS) [34], ORB SLAM [21], and an improved version of ORB SLAM 2 [23].…”
Section: Map and Navigation Functionality of Pepper
confidence: 99%
“…A suggestion to use more than the ranger sensors is made as future work objective in [18]. This suggestion is followed by the work in [19] which uses the RGB and depth sensors of Pepper to perform visual SLAM (specifically, an adapted version of the ORB-SLAM algorithm [20]), using the range sensors only for obstacle avoidance during navigation. However, in this case SLAM takes longer to complete and the approach has difficulty in estimating a correct localization in visually feature-poor environments (e.g., a large hall), given the narrow field of view of the RGB camera installed on Pepper.…”
Section: Related Work
confidence: 99%