2006 IEEE 12th Digital Signal Processing Workshop & 4th IEEE Signal Processing Education Workshop
DOI: 10.1109/dspws.2006.265448

Embedded Vision System for Real-Time Object Tracking using an Asynchronous Transient Vision Sensor

Abstract: This paper presents an embedded vision system for object tracking applications based on a 128×128 pixel CMOS temporal contrast vision sensor. This imager asynchronously responds to relative illumination intensity changes in the visual scene, exhibiting a usable dynamic range of 120 dB and a latency of under 100 µs. The information is encoded in the form of Address-Event Representation (AER) data. An algorithm for object tracking with 1 millisecond timestamp resolution of the AER data stream is presented. As a re…
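The abstract describes tracking objects directly from the asynchronous AER event stream, where each event carries a pixel address and a timestamp. A common way to realize this (and the one suggested by later citation statements referring to "clustered blob-like sources of events") is to assign each incoming event to the nearest live cluster and nudge that cluster's centre toward it. The sketch below is a minimal illustration under those assumptions; the event format, gating radius, update rule, and timeout are hypothetical and not the authors' exact algorithm.

```python
# Hypothetical sketch of cluster-based tracking on Address-Event
# Representation (AER) data. Parameters (radius, alpha, timeout_us)
# are illustrative assumptions, not values from the paper.
from dataclasses import dataclass

@dataclass
class Cluster:
    x: float          # current blob centre (pixels)
    y: float
    last_t: int       # timestamp of last assigned event (µs)
    count: int = 1    # events absorbed so far

def track(events, radius=10.0, alpha=0.1, timeout_us=50_000):
    """Assign each AER event (x, y, t) to the nearest live cluster,
    moving that cluster's centre toward the event; start a new
    cluster when no centre lies within `radius` pixels."""
    clusters = []
    for x, y, t in events:
        # Retire clusters that have received no events recently.
        clusters = [c for c in clusters if t - c.last_t <= timeout_us]
        # Find the nearest cluster inside the gating radius.
        best, best_d2 = None, radius * radius
        for c in clusters:
            d2 = (c.x - x) ** 2 + (c.y - y) ** 2
            if d2 <= best_d2:
                best, best_d2 = c, d2
        if best is None:
            clusters.append(Cluster(float(x), float(y), t))
        else:
            # Exponential moving average pulls the centre toward the event.
            best.x += alpha * (x - best.x)
            best.y += alpha * (y - best.y)
            best.last_t = t
            best.count += 1
    return clusters

# Two spatially separated event bursts yield two clusters.
evts = [(10, 10, 0), (11, 10, 100), (100, 100, 200), (101, 101, 300)]
print(len(track(evts)))  # 2
```

Because every event is processed independently in constant time, this style of tracker keeps up with the microsecond-latency sensor output without ever reconstructing frames, which is the key efficiency argument for AER-based embedded vision.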

Cited by 124 publications (106 citation statements); references 10 publications.
“…[38][39][40][41]) and high speed robotics [42,43]. Eight papers on event based sensing have been accepted to the very competitive IEEE International Solid State Circuits Conference since 2003 (two on auditory sensors [44,45] and six on vision sensors [9,16,27,33,46,47]), showing that this approach is starting to impact mainstream electronics.…”
Section: Results
Mentioning confidence: 99%
“…Since they assume that the background is constant and the system reacts only to changes, their system cannot be applied in a dynamic environment. The same goes for the work of Litzenberger et al. [22], which used this method to track pedestrians and vehicles.…”
Section: State of the Art
Mentioning confidence: 81%
“…Thousands of events are triggered in the time between two frames since, due to the sensor's motion, intensity changes occur at all pixels. tracked moving objects as clustered blob-like sources of events [11], [10]. The high-speed advantage of event cameras was also shown in [12] for a pencil-balancing robot.…”
Section: A. Event-Based Feature Detection and Tracking
Mentioning confidence: 87%