We present a system consisting of a miniature unmanned aerial vehicle (UAV) and a small carrier vehicle, in which the UAV is capable of autonomously taking off from the moving ground vehicle, tracking it at a constant distance, and landing on a platform on the carrier while it is in motion. Our visual tracking approach differs from other methods by using low-cost, lightweight commodity consumer hardware. As the main sensor we use a Wii remote infrared (IR) camera, which allows robust tracking of a pattern of IR lights in conditions without direct sunlight. The system does not need to communicate with the ground vehicle and works with an onboard 8-bit microcontroller. Nevertheless, the position and orientation relative to the IR pattern are estimated at a frequency of approximately 50 Hz. This enables the UAV to fly fully autonomously, performing flight control, self-stabilisation and visual tracking of the ground vehicle. We present experiments in which our UAV performs autonomous flights with a moving ground carrier describing a circular path and with the carrier rotating. The system yields small errors and allows for safe, autonomous indoor flights.
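The Wii remote IR camera reports only blob centroids, so the range to the light pattern has to be recovered geometrically. A minimal sketch of the underlying pinhole relation (the pattern width and focal length below are illustrative assumptions for the example, not values from the paper):

```python
def distance_from_blob_spacing(px_spacing, pattern_width_m=0.3, focal_px=1280.0):
    """Estimate the range to an IR pattern from the pixel spacing of two blobs.

    Pinhole-camera relation: range = focal_length * real_width / pixel_width.
    pattern_width_m and focal_px are assumed example values; the Wii remote
    camera reports blob centroids on a 1024x768 coordinate grid.
    """
    if px_spacing <= 0:
        raise ValueError("blob spacing must be positive")
    return focal_px * pattern_width_m / px_spacing
```

With these example values, two blobs 128 pixels apart would place the pattern about three metres away; the actual system additionally recovers full position and orientation from the complete pattern.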
Vision-based robot localization in outdoor environments is difficult because of changing illumination conditions. Another problem is the rough and cluttered environment, which makes it hard to use visual features that are not rotation invariant. A popular method that is rotation invariant and relatively robust to changing illumination is the Scale-Invariant Feature Transform (SIFT). However, due to its computationally intensive feature extraction and image matching, localization using SIFT is slow. Techniques that use global image features, on the other hand, are in general less robust and exact than SIFT, but are often much faster due to fast image matching. In this paper, we present a hybrid localization approach that switches between local and global image features. For most images, the hybrid approach uses fast global features; only in difficult situations, e.g. under strong illumination changes, does it switch to local features. To decide which features to use for an image, we analyze the particle cloud of the particle filter that we use for position estimation. Experiments on outdoor images taken under varying illumination conditions show that the position estimates of the hybrid approach are about as exact as those of SIFT alone, while the average localization time using the hybrid approach is more than 3.5 times faster than using SIFT.
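The switching decision rests on how dispersed the particle cloud is: a compact cloud indicates a confident estimate for which cheap global features suffice, while a dispersed cloud signals a difficult image. A minimal sketch of such a spread test (the threshold and function names are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def choose_feature_type(particles, weights, spread_threshold=2.0):
    """Pick fast global features or robust local (SIFT) features based on
    how dispersed the particle cloud of the localization filter is.

    particles: (N, 2) array of x/y position hypotheses.
    weights:   (N,) normalized importance weights.
    spread_threshold is an assumed example value, not from the paper.
    """
    mean = np.average(particles, axis=0, weights=weights)
    # Weighted variance of the particle positions per axis.
    var = np.average((particles - mean) ** 2, axis=0, weights=weights)
    spread = float(np.sqrt(var).max())
    # Compact, confident cloud -> cheap global features suffice;
    # dispersed cloud -> fall back to the more robust local features.
    return "global" if spread < spread_threshold else "sift"
```

In a particle filter, this check runs once per image before feature extraction, so the expensive SIFT pipeline is only invoked when the filter is actually uncertain.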
We present a follow-the-leader scenario with a system of two small, low-cost quadrocopters of different types and configurations. The leader is a Parrot AR.Drone piloted autonomously by an iPad app that utilizes the visual odometry provided by the quadrocopter. The follower is an AscTec Hummingbird controlled by an onboard 8-bit microcontroller. Neither communication nor external sensors are required. A custom-built pan/tilt unit carrying the camera of a Nintendo Wii remote tracks a pattern of infrared lights and allows for online pose estimation. A base station allows for monitoring the behavior but is not required for autonomous flights. Our efficient solution of the perspective-3-point problem allows estimating the pose of the camera relative to the pattern in six degrees of freedom at a high frequency on the microcontroller. The presented experiments include a scenario in which the follower follows the leader at a constant distance of two meters while flying different shapes in a narrow, GPS-denied indoor environment.
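Once the follower has an estimate of the leader's relative position, holding a two-metre gap reduces to closing the distance error along the line of sight. A minimal sketch of such a proportional distance controller (the gain and function names are illustrative assumptions, not the controller from the paper):

```python
import math

def follow_command(rel_pos, desired_dist=2.0, gain=0.8):
    """Compute a horizontal velocity command from the relative pose estimate.

    rel_pos: (x, y) position of the leader relative to the follower, metres.
    Returns (vx, vy) driving the follower toward the desired separation;
    desired_dist and gain are assumed example values.
    """
    dist = math.hypot(rel_pos[0], rel_pos[1])
    if dist < 1e-6:
        # Degenerate case: no direction to the leader, hold position.
        return (0.0, 0.0)
    # Unit vector toward the leader, scaled by the distance error.
    err = dist - desired_dist
    return (gain * err * rel_pos[0] / dist, gain * err * rel_pos[1] / dist)
```

A negative error (leader closer than two metres) produces a command away from the leader, so the same expression both closes and opens the gap.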