2011
DOI: 10.1111/j.1365-2818.2011.03565.x

Asynchronous event‐based high speed vision for microparticle tracking

Abstract: This paper presents a new high-speed vision system using an asynchronous address-event representation camera. Within this framework, an asynchronous event-based real-time Hough circle transform is developed to track microspheres. The technology presented in this paper allows robust real-time, event-based multi-object position detection at a frequency of several kilohertz with a low computational cost. Brownian motion is also detected within this context with both high speed and precision. The carried-out work is …
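
To make the event-based Hough circle transform concrete, the following is a minimal sketch of a per-event voting scheme for a single known sphere radius. The sensor resolution, radius, decay factor, and function names are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

# Minimal sketch (assumed parameters): an incremental, per-event Hough circle
# vote for one known microsphere radius R. Each incoming event (x, y) votes
# for every candidate centre lying on the circle of radius R around the event
# location; the running maximum of the accumulator gives the current sphere
# position without ever building frames.

WIDTH, HEIGHT = 128, 128   # DVS-style sensor resolution (assumption)
R = 10                     # known microsphere radius in pixels (assumption)
DECAY = 0.98               # per-event accumulator decay (assumption)

accumulator = np.zeros((HEIGHT, WIDTH), dtype=np.float32)
angles = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
dx = np.round(R * np.cos(angles)).astype(int)
dy = np.round(R * np.sin(angles)).astype(int)

def on_event(x, y):
    """Update the centre accumulator for one address-event and return the
    current best centre estimate as (x, y)."""
    global accumulator
    accumulator *= DECAY                        # forget old evidence
    cx, cy = x + dx, y + dy                     # candidate centres on the circle
    valid = (cx >= 0) & (cx < WIDTH) & (cy >= 0) & (cy < HEIGHT)
    accumulator[cy[valid], cx[valid]] += 1.0    # cast votes
    best = np.unravel_index(np.argmax(accumulator), accumulator.shape)
    return best[1], best[0]

# Example: a short synthetic event stream lying on a circle centred at (64, 64).
for a in angles[::8]:
    ex = int(round(64 + R * np.cos(a)))
    ey = int(round(64 + R * np.sin(a)))
    centre = on_event(ex, ey)
print("estimated centre:", centre)
```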

Cited by 86 publications (58 citation statements). References 21 publications.
“…Its pixels detect contrast changes in the scene rather than absolute grey-level values, and each pixel sends its own data as local information independently and asynchronously [14]. The dedicated algorithms are thus frame-free and can operate at high speeds, on the order of several kilohertz [15], [16].…”
Section: Introduction
Confidence: 99%
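
As a rough illustration of the frame-free, asynchronous processing described above, the sketch below defines an address-event record and a dispatch loop that handles each event as it arrives. The type and function names are hypothetical and not part of any real camera driver API.

```python
from dataclasses import dataclass
from typing import Callable, Iterable

# Minimal sketch (illustrative names): an address-event is just
# (x, y, timestamp, polarity); a frame-free pipeline processes each event as
# it arrives instead of waiting to assemble images.

@dataclass(frozen=True)
class AddressEvent:
    x: int          # pixel column
    y: int          # pixel row
    t_us: int       # microsecond timestamp
    polarity: int   # +1 contrast increase, -1 contrast decrease

def run_event_loop(stream: Iterable[AddressEvent],
                   handler: Callable[[AddressEvent], None]) -> None:
    """Dispatch every event to the handler immediately; there is no frame
    clock, so latency is set by the event rate rather than an exposure time."""
    for ev in stream:
        handler(ev)

# Usage: count ON/OFF events from a toy stream.
toy_stream = [AddressEvent(3, 7, 10, +1), AddressEvent(4, 7, 12, -1)]
counts = {+1: 0, -1: 0}

def count(ev: AddressEvent) -> None:
    counts[ev.polarity] += 1

run_event_loop(toy_stream, count)
print(counts)
```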
“…The use of event-based retinas requires the development of time-oriented, event-based algorithms in order to benefit fully from the properties of this new framework [19]. Neural shape coding is a difficult issue, as there is an almost infinite number of possible representations of shapes in the real world.…”
Section: Introduction
Confidence: 99%
“…This paper builds on several previous papers. In [19], a DVS was used to develop an event-based Hough transform to track circles specifically. Hough transforms rely on a voting scheme and on maximum detection within the accumulation space to identify the location of a shape.…”
Section: Introduction
Confidence: 99%
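
The maximum-detection step mentioned here can be illustrated with a short sketch that extracts several peaks from an accumulation space, which is what allows more than one object to be located at once. The vote threshold and window size are assumptions for illustration, not values from the cited work.

```python
import numpy as np
from scipy.ndimage import maximum_filter

# Minimal sketch (assumed thresholds): extract shape locations from a Hough
# accumulation space by keeping local maxima above a vote threshold, i.e. the
# "maximum detection" step that follows the voting scheme.

def detect_peaks(accumulator: np.ndarray, min_votes: float, window: int = 5):
    """Return (x, y) positions whose vote count is a local maximum within a
    `window`-pixel neighbourhood and exceeds `min_votes`."""
    local_max = maximum_filter(accumulator, size=window) == accumulator
    ys, xs = np.nonzero(local_max & (accumulator >= min_votes))
    return list(zip(xs.tolist(), ys.tolist()))

# Usage: two well-separated peaks are both reported.
acc = np.zeros((64, 64))
acc[20, 20] = 30.0
acc[45, 10] = 25.0
print(detect_peaks(acc, min_votes=20.0))
```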
“…The first pick-and-place manipulation with 3D haptic feedback using a microgripper is successfully achieved. This work builds on several previous studies: on the stability of haptic coupling schemes for applications at the microscale [26], on the definition of virtual guides for the pick-and-place of microspheres using two atomic force microscopy cantilevers [27], on the use of visual feedback from a scanning electron microscope for teleoperation [28], and on the use of dynamic vision sensors [29]. However, to our knowledge, this is the first time that dynamic vision sensors have been used to provide stable haptic feedback to assist users while performing a pick-and-place operation.…”
Section: Introduction
Confidence: 99%