Fig. 1: This paper provides the complex urban data set, including a metropolitan area, an apartment building complex, and an underground parking lot. Sample scenes from the data set can be found at https://youtu.be/IguZjmLf5V0.

Abstract: This paper presents a Light Detection and Ranging (LiDAR) data set that targets complex urban environments. Urban environments with high-rise buildings and congested traffic pose a significant challenge for many robotics applications. The presented data set is unique in the sense that it is ab…
“…Indeed, as illustrated in Figure 2, when the wheels of a car actually stop, the car undergoes a rotational motion.¹ (Fig. 2 caption: real IMU data of a car stopping, from sequence urban06 of [11]; we see that (9) holds, and thus z_n^VEL = 1, at t = 5.8 s, while (10) does not hold yet.) ¹ Without loss of generality, we assume that the body frame is aligned with the IMU frame.…”
Section: B. Discussion on the Choice of Profiles (mentioning)
confidence: 79%
“…The following results are obtained on the complex urban LiDAR dataset [11], which consists of data recorded on a consumer car moving in complex urban environments, e.g. metropolitan areas, large building complexes, and underground parking lots; see Figure 6.…”
Section: Results on Car Dataset (mentioning)
confidence: 99%
“…For the level of precision we pursue in the present paper, distinguishing between (9) and (10) is pivotal, since it allows us to: i) properly label motion profiles before training (see Section V-B, where we have different thresholds on position and on angular velocity); and ii) improve detection accuracy, since only one motion pattern can be identified as valid. (11) and (13) generally hold for robots moving indoors or cars on roads. Note that (13) is expressed in the body frame, and thus generally holds for a car moving on a road even if the road is not level.…”
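The labeling scheme the snippet describes — flagging a zero-velocity sample only when both a position threshold and an angular-velocity threshold are satisfied — can be sketched as follows. This is an illustrative reconstruction, not the paper's code: the function name `label_zero_velocity` and the threshold values are hypothetical placeholders.

```python
import numpy as np

def label_zero_velocity(positions, angular_velocities,
                        pos_threshold=0.01, ang_threshold=0.005):
    """Label each step as zero-velocity (1) or moving (0).

    Illustrative sketch: a step is flagged as zero-velocity only when
    BOTH the position displacement and the gyro magnitude fall below
    their thresholds, mirroring the separate thresholds on position
    and on angular velocity mentioned in the snippet.
    """
    labels = []
    for i in range(1, len(positions)):
        displacement = np.linalg.norm(positions[i] - positions[i - 1])
        gyro_norm = np.linalg.norm(angular_velocities[i])
        labels.append(1 if (displacement < pos_threshold and
                            gyro_norm < ang_threshold) else 0)
    return np.array(labels)
```

Requiring both conditions reflects the point made above: a car whose wheels have stopped may still rotate slightly, so position alone is not enough to declare a valid zero-velocity profile.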
Section: B. Discussion on the Choice of Profiles (mentioning)
confidence: 99%
“…Taking vehicle constraints and odometer measurements into account is known to increase the robustness of visual-… (Figure caption: trajectory ground truth and estimates obtained by various methods: integration of IMU signals; odometry based on a differential wheel-encoder system; odometry combined with a highly accurate and expensive fiber-optic gyro (FoG) that provides orientation estimates; and the proposed RINS-W approach, which considers only the IMU sensor on board the vehicle and which outperforms the other schemes.) The final distance error for this long-term sequence urban16 (73 minutes) of the car dataset [11] is 20 m for the RINS-W solution. The deep-learning-based detector (see Section IV-A) has of course not been trained or cross-validated on this sequence.…”
Section: A. Related Work (mentioning)
confidence: 99%
“…• we demonstrate the performance of the approach on a publicly available car dataset [11] in Section V. Our approach, based solely on the IMU, produces accurate estimates with a final distance w.r.t. ground truth of 20 m on the 73-minute test sequence urban16; see Figure 1.…”
This paper proposes a real-time approach for long-term inertial navigation based only on an Inertial Measurement Unit (IMU) for self-localizing wheeled robots. The approach builds upon two components: 1) a robust detector that uses recurrent deep neural networks to dynamically detect a variety of situations of interest, such as zero velocity or no lateral slip; and 2) a state-of-the-art Kalman filter which incorporates this knowledge as pseudo-measurements for localization. Evaluations on a publicly available car dataset demonstrate that the proposed scheme may achieve a final precision of 20 m for a 21 km long trajectory of a vehicle driving for over an hour, equipped with an IMU of moderate precision (the gyro drift rate is 10 deg/h). To our knowledge, this is the first paper which combines sophisticated deep learning techniques with state-of-the-art filtering methods for pure inertial navigation on wheeled vehicles, and as such it opens the door to novel data-driven inertial navigation techniques. Moreover, albeit tailored for IMU-only localization, our method may be used as a component for the self-localization of wheeled robots equipped with a more complete sensor suite.
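The abstract's second component — a Kalman filter that absorbs the detector's output as pseudo-measurements — can be illustrated with a minimal zero-velocity update. This is a generic sketch of the pseudo-measurement idea, not the paper's implementation: the state layout (position then velocity), the function name, and the noise value `R_vel` are all assumptions.

```python
import numpy as np

def zero_velocity_update(x, P, R_vel=1e-4):
    """Apply a zero-velocity pseudo-measurement in a standard Kalman update.

    Illustrative sketch only: the state x stacks position x[0:3] and
    velocity x[3:6]. When the detector flags a stop, we "measure"
    v = 0 with small noise R_vel; the usual Kalman update then pulls
    the velocity estimate toward zero, curbing inertial drift.
    """
    n = x.shape[0]
    H = np.zeros((3, n))
    H[:, 3:6] = np.eye(3)        # pseudo-measurement selects the velocity block
    z = np.zeros(3)              # the "measured" velocity is exactly zero
    R = R_vel * np.eye(3)
    S = H @ P @ H.T + R          # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)
    x_new = x + K @ (z - H @ x)
    P_new = (np.eye(n) - K @ H) @ P
    return x_new, P_new
```

The same update structure applies to the other situations of interest the detector emits (e.g. no lateral slip), with `H` selecting the constrained component in the body frame instead of the full velocity.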