Humans effortlessly estimate the positions of nearby objects in real time purely from visual perception, a capability desirable for many real-world robotic scenarios, such as a mobile robot approaching a target or a robot arm reaching for human-placed objects. Today, such vision-based object tracking requires significant computational effort even for clearly marked objects, because of the challenges of processing huge amounts of mainly redundant image data in real time (e.g. handling unknown illumination, disentangling objects from cluttered backgrounds). Small autonomous robots typically cannot provide sufficient on-board processing power for visual object tracking. This paper presents a biologically inspired miniature sensor system for real-time visual object tracking at rates of several hundred hertz, while utilizing only minimal computing resources. The system combines two functionally separate components: (I) a recently developed Dynamic Vision Sensor chip (DVS), which, instead of transmitting full image frames at fixed time intervals, asynchronously emits "spike events" caused by temporal changes of illumination at individual pixels. Such biologically inspired information encoding drastically reduces the amount of data to be processed compared to traditional video cameras, and significantly increases time resolution. The other component (II) is a 32-bit, 64 MHz microcontroller with 64 KB of on-board SRAM, which executes an event-based algorithm to track marked objects (here, high-frequency flashing LEDs) in real time, based on the DVS's output stream of spike events. The complete miniature sensor system requires less than 200 mW to autonomously track markers in real time at update rates well above 100 Hz, at a cost below US$10 if produced in large quantities. This paper presents the asynchronous event-based tracking algorithm and evaluates the sensor system's performance in real-world robotics scenarios.
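To make the event-based tracking idea concrete, the following C sketch illustrates one plausible way a microcontroller could single out a high-frequency flashing LED from a DVS spike stream: pixels whose inter-spike interval matches the LED's known flash period pull a smoothed position estimate toward them, while clutter-driven events are ignored. All names, buffer sizes, and parameters here are illustrative assumptions, not the authors' actual implementation; in particular, a real 64 KB SRAM deployment would likely subsample the per-pixel timestamp map or use coarser timestamps.

```c
/* Hypothetical sketch of event-based LED tracking from a DVS spike stream.
 * Assumes events arrive as (x, y, timestamp) tuples and the marker LED
 * flashes at a known frequency (here 1 kHz). */

#include <stdint.h>
#include <stdlib.h>

#define WIDTH          128   /* assumed DVS pixel array dimensions          */
#define HEIGHT         128
#define LED_PERIOD_US 1000   /* expected period of a 1 kHz flashing LED     */
#define TOLERANCE_US   100   /* accepted timing jitter around that period   */

typedef struct { uint8_t x, y; uint16_t t_us; } dvs_event_t;

/* 16-bit microsecond timestamps (wrapping) keep the map at 32 KB. */
static uint16_t last_spike_us[WIDTH][HEIGHT];
static float track_x, track_y;   /* current marker position estimate */

/* Process one incoming event: if this pixel's inter-spike interval matches
 * the LED period, nudge the tracked position toward the event location. */
void process_event(const dvs_event_t *e)
{
    /* wrap-safe interval since the last event at this pixel */
    uint16_t isi = (uint16_t)(e->t_us - last_spike_us[e->x][e->y]);
    last_spike_us[e->x][e->y] = e->t_us;

    if (abs((int)isi - LED_PERIOD_US) < TOLERANCE_US) {
        const float alpha = 0.1f;   /* exponential smoothing factor */
        track_x += alpha * ((float)e->x - track_x);
        track_y += alpha * ((float)e->y - track_y);
    }
}
```

Because each event triggers only a few arithmetic operations, such a per-event update loop is cheap enough to sustain update rates far above conventional frame rates on a small microcontroller.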
Abstract. Following an object's position relative to oneself is a fundamental capability required of intelligent robotic systems interacting with the real world. This paper presents a computationally efficient vision-based 3D tracking system that can ultimately operate in real time on autonomous mobile robots in cluttered environments. At the core of the system, two neurally inspired event-based dynamic vision sensors (eDVS) independently track a high-frequency flickering LED in their respective 2D angular coordinate frames. A self-adjusting feed-forward neural network maps these independent 2D angular coordinates into a Cartesian 3D position in world coordinates. During an initial calibration phase, an object composed of multiple independent markers with known geometry provides relative position information between those markers for network training (without ever using absolute world coordinates for training). In the subsequent application phase, tracking a single marker yields position estimates relative to the sensor origin, while tracking multiple markers additionally provides orientation. The neurally inspired vision-based tracking system runs in real time on ARM7 microcontrollers, without the need for an external PC.
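As a rough illustration of the 2D-to-3D mapping stage, the C sketch below shows a forward pass of a small feed-forward network taking the two sensors' angular coordinates (u1, v1, u2, v2) as input and producing a Cartesian (x, y, z) estimate. The layer sizes, tanh activation, and linear output are assumptions chosen for illustration; the paper only states that the weights are learned during the calibration phase, not this particular architecture.

```c
/* Hypothetical sketch of the feed-forward mapping from two sensors'
 * 2D angular coordinates to a 3D Cartesian position. Weights and biases
 * are placeholders that would be filled in by the calibration procedure. */

#include <math.h>

#define N_IN      4   /* (u1, v1) from sensor 1, (u2, v2) from sensor 2 */
#define N_HIDDEN 16   /* assumed hidden layer size                      */
#define N_OUT     3   /* Cartesian (x, y, z) relative to sensor origin  */

static float w1[N_HIDDEN][N_IN], b1[N_HIDDEN];   /* input -> hidden  */
static float w2[N_OUT][N_HIDDEN], b2[N_OUT];     /* hidden -> output */

/* One forward pass: in[4] -> tanh hidden layer -> linear output out[3]. */
void map_2d_to_3d(const float in[N_IN], float out[N_OUT])
{
    float h[N_HIDDEN];

    for (int j = 0; j < N_HIDDEN; ++j) {
        float s = b1[j];
        for (int i = 0; i < N_IN; ++i)
            s += w1[j][i] * in[i];
        h[j] = tanhf(s);
    }
    for (int k = 0; k < N_OUT; ++k) {
        float s = b2[k];
        for (int j = 0; j < N_HIDDEN; ++j)
            s += w2[k][j] * h[j];
        out[k] = s;
    }
}
```

A forward pass of this size costs only a few hundred multiply-accumulates, which is consistent with the claim that the complete pipeline fits on ARM7 microcontrollers without an external PC.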