2020 IEEE International Conference on Robotics and Automation (ICRA)
DOI: 10.1109/icra40945.2020.9197341
Asynchronous event-based clustering and tracking for intrusion monitoring in UAS

Abstract: Automatic surveillance and monitoring using Unmanned Aerial Systems (UAS) require the development of perception systems that work robustly under different illumination conditions. Event cameras are neuromorphic sensors that capture the illumination changes in the scene with very low latency and high dynamic range. Although recent advances in event-based vision have explored the use of event cameras onboard UAS, most techniques group events into frames and, therefore, do not fully exploit the sequential and asynch…

Cited by 31 publications (27 citation statements)
References 29 publications
“…In addition to the datasets discussed above, Table 3 presents an overview of various multi-sensor agent navigation datasets, including UZH-FPV Drone Racing [125], TUM RGB-D Dataset [126], ScanNet [127], NYU V2 [128], InteriorNet [129], SceneNet RGB-D [130], and others [131, 132, 133, 134, 135, 136, 137, 138, 139, 140, 141, 142, 143, 144]. These datasets provide the basic requirements for simulating and evaluating multi-sensor fusion in experiments.…”
Section: Multi-modal Datasets
confidence: 99%
“…such as visual servoing [19], motion segmentation [20], surveillance tasks [21], robot localization [22], and onboard computation load management [23], among many others. A number of datasets for event-based vision have been presented exploring the advantages of event cameras onboard aerial robots [13], [14], [21], [24]. The works in [21] and [24] provide sequences recorded onboard multirotors used to evaluate event-based methods for tracking moving objects.…”
Section: Related Work
confidence: 99%
“…The advantages of event cameras have motivated increasing research interest from the robotics and computer vision communities [3], [17]. A wide variety of methods have been used for feature extraction [18], clustering [19], tracking [20], optical flow computation [21], detection of objects in motion [22], and SLAM [23], among many others. A full review of event-processing methods can be found in [3].…”
Section: State of the Art
confidence: 99%
“…To ensure the statistical consistency of the EKF, the event-by-event and event-image processing modules use different events. The events received from the camera are randomly sampled as in [22]. γ_e and γ_i are the percentages of the input events that are sent to the event-by-event and event-image processing modules, respectively.…”
Section: Asynchronous Event-based Line Tracking
confidence: 99%
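The last citation statement describes splitting the incoming event stream by random sampling, with fractions γ_e and γ_i routed to two separate processing modules so that each works on disjoint events. A minimal sketch of that routing idea, assuming a generic event stream and illustrative fraction values (the function and parameter names here are hypothetical, not from the cited paper):

```python
import random

def route_events(events, gamma_e=0.3, gamma_i=0.3, rng=None):
    """Randomly route each event to at most one of two modules.

    A fraction gamma_e of events goes to the event-by-event module and a
    fraction gamma_i to the event-image module; each event lands in at
    most one bucket, so the two modules never share an event.
    """
    rng = rng or random.Random()
    by_event, image = [], []
    for ev in events:
        r = rng.random()  # uniform in [0, 1)
        if r < gamma_e:
            by_event.append(ev)
        elif r < gamma_e + gamma_i:
            image.append(ev)
        # otherwise the event is discarded, reducing compute load
    return by_event, image
```

Because the sampling is per-event and stateless, this kind of split preserves the asynchronous nature of the stream: no event buffering or framing is required before routing.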