2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2020
DOI: 10.1109/iros45743.2020.9341208
IDOL: A Framework for IMU-DVS Odometry using Lines

Abstract: In this paper, we introduce IDOL, an optimization-based framework for IMU-DVS Odometry using Lines. Event cameras, also called Dynamic Vision Sensors (DVSs), generate highly asynchronous streams of events triggered by illumination changes at each individual pixel. This novel paradigm presents advantages in low-illumination conditions and high-speed motions. Nonetheless, this unconventional sensing modality brings new challenges to scene reconstruction and motion estimation. The proposed method offers…
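The abstract's notion of an asynchronous per-pixel event stream can be illustrated with a minimal sketch. This is not IDOL's actual data structure; the `Event` class, field names, and contrast-threshold comment are assumptions for illustration only.

```python
from dataclasses import dataclass

# Hypothetical minimal representation of a DVS event stream: each event fires
# independently per pixel when the log-intensity change at that pixel exceeds
# a contrast threshold, so there is no global frame clock.
@dataclass(frozen=True)
class Event:
    x: int          # pixel column
    y: int          # pixel row
    t: float        # timestamp in seconds (microsecond resolution in practice)
    polarity: int   # +1 for a brightness increase, -1 for a decrease

def events_in_window(events, t0, t1):
    """Select the asynchronous events falling in the time window [t0, t1)."""
    return [e for e in events if t0 <= e.t < t1]

stream = [Event(10, 20, 0.0010, +1),
          Event(11, 20, 0.0015, -1),
          Event(10, 21, 0.0040, +1)]
window = events_in_window(stream, 0.0, 0.002)
print(len(window))  # 2
```

Because events carry individual timestamps rather than a shared frame time, any downstream estimator must either aggregate them over windows, as sketched above, or treat time continuously.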

Cited by 22 publications (16 citation statements)
References 34 publications
“…
                                 Correctness   Instance   Persistence
  Hough transf. [9], [10], [11]       +           -            +
  Non parametric [12], [13]           +           -            -
  Spatio-temporal [14], [15]          +           +            -
  Ours                                +           +            +
…and frequent changes in appearance in the stream of the events.…”
Section: Methods Category
confidence: 99%
“…The difficulty of line tracking using events led previous works [9], [10], [11], [12], [13], [14], [15] to explore different techniques while pursuing a common goal: robustness. We identify the robustness of line tracking as the combination of three desired characteristics.…”
Section: Methods Category
confidence: 99%
“…A promising idea to realise asynchronous event-driven VO is to estimate a continuous-time (CT) camera trajectory from E. This implicitly allows the camera pose to be temporally interpolated and extrapolated, which in turn provides a basis to perform P1 and P2. Along the lines above, there have been efforts on event-driven VO [7], [8], [9], [10]. However, previous works either assume a known map [7], [8], restrict the camera to planar motions [9], or incorporate an IMU with preintegration [11] thereby achieving event-inertial VO [10].…”
Section: Introduction
confidence: 99%
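The citation above notes that a continuous-time (CT) trajectory lets the camera pose be queried at arbitrary event timestamps. A minimal sketch of that idea, assuming two timed control poses and the simplest interpolation scheme (linear for translation, slerp for rotation); real CT estimators typically use B-splines over many control poses, and all function names here are illustrative.

```python
import math

def slerp(q0, q1, u):
    """Spherical linear interpolation between unit quaternions (w, x, y, z)."""
    dot = sum(a * b for a, b in zip(q0, q1))
    if dot < 0.0:              # take the shorter arc on the quaternion sphere
        q1 = tuple(-c for c in q1)
        dot = -dot
    if dot > 0.9995:           # nearly parallel: lerp and renormalise
        q = tuple((1 - u) * a + u * b for a, b in zip(q0, q1))
        n = math.sqrt(sum(c * c for c in q))
        return tuple(c / n for c in q)
    theta = math.acos(dot)
    s0 = math.sin((1 - u) * theta) / math.sin(theta)
    s1 = math.sin(u * theta) / math.sin(theta)
    return tuple(s0 * a + s1 * b for a, b in zip(q0, q1))

def interpolate_pose(t, t0, p0, q0, t1, p1, q1):
    """Query the pose at time t between control poses (t0, p0, q0), (t1, p1, q1)."""
    u = (t - t0) / (t1 - t0)
    p = tuple((1 - u) * a + u * b for a, b in zip(p0, p1))
    return p, slerp(q0, q1, u)

# Query the pose halfway between two control poses: identity rotation at t=0,
# a 90-degree rotation about z plus 1 m of x-translation at t=1.
q_end = (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
p, q = interpolate_pose(0.5, 0.0, (0.0, 0.0, 0.0), (1.0, 0.0, 0.0, 0.0),
                        1.0, (1.0, 0.0, 0.0), q_end)
print(p)  # (0.5, 0.0, 0.0)
```

This per-timestamp querying is what makes a CT representation a natural fit for asynchronous events, since no two events need share a pose estimate.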