2018
DOI: 10.1007/978-3-030-01258-8_46

Asynchronous, Photometric Feature Tracking Using Events and Frames

Abstract: We present a method that leverages the complementarity of event cameras and standard cameras to track visual features with low latency. Event cameras are novel sensors that output pixel-level brightness changes, called "events". They offer significant advantages over standard cameras, namely a very high dynamic range, no motion blur, and a latency in the order of microseconds. However, because the same scene pattern can produce different events depending on the motion direction, establishing event correspondenc…
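As context for the abstract (not taken from the paper itself), the event-camera literature conventionally models an event as being fired whenever the log-intensity change at a pixel since its last event reaches a contrast threshold. A hedged sketch in standard notation, where L, C, and the polarity p are conventional symbols rather than this paper's:

\[
\Delta L(\mathbf{x}, t) \;=\; \log I(\mathbf{x}, t) - \log I(\mathbf{x}, t - \Delta t) \;=\; p\,C, \qquad p \in \{+1, -1\},
\]

where \Delta t is the time elapsed since the previous event at pixel \mathbf{x} and p indicates whether brightness increased or decreased.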

Cited by 113 publications (77 citation statements) · References 36 publications
“…Early approaches did not reconstruct videos, but focused on the reconstruction of a single image from a large set of events collected by an event camera moving through a static scene. These works exploit the fact that every event provides one equation relating the intensity gradient and optic flow through brightness constancy [15]. Cook et al [10] used bio-inspired, interconnected networks to simultaneously recover intensity images, optic flow, and angular velocity from an event camera performing small rotations.…”
Section: Related Work (mentioning)
confidence: 99%
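The "one equation per event" relation mentioned in the excerpt above is commonly obtained by linearizing the event-generation model under a brightness-constancy assumption; a sketch in conventional notation (∇L, the optic flow v, and Δt are standard symbols, not quoted from the cited works):

\[
\Delta L(\mathbf{x}, t) \;\approx\; -\,\nabla L(\mathbf{x}, t) \cdot \mathbf{v}(\mathbf{x})\, \Delta t,
\]

so each event constrains only the component of the optic flow along the local intensity gradient (the aperture problem), which is why such reconstructions aggregate large sets of events.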
“…Event cameras such as the DAVIS and DVS [3,17] report log intensity changes, inspired by human vision. Although several works try to explore the advantages of the high temporal resolution provided by event cameras [39,13,26,41,40,8,15], how to make the best use of the event camera has not yet been fully investigated.…”
Section: Related Work (mentioning)
confidence: 99%
“…In contrast, methods operating on event packets trade off latency for computational efficiency and performance. Despite their differences, both paradigms have been successfully applied to various vision tasks, including tracking [19,21,40,42], depth estimation [3,52,67], visual odometry [27,54,57,66], recognition [29,44], and optical flow estimation [7,69]. A good survey on the applications of event cameras can be found in [18].…”
Section: Introduction (mentioning)
confidence: 99%