Unmanned aerial vehicles such as drones are a key emerging technology with many beneficial applications. As they have proliferated, however, security and privacy concerns have grown as well. Drone tracking with a moving camera is one important way to address these concerns, and it poses several challenges. First, drones move quickly and are usually tiny. Second, images captured by a moving camera suffer from illumination changes. Moreover, tracking must run in real time for surveillance applications. For fast and accurate drone tracking, this paper proposes a tracking framework comprising two trackers, a predictor, and a refinement process. One tracker finds the moving target based on motion flow, and the other locates the region of interest (ROI) using histogram features. The predictor estimates the trajectory of the target with a Kalman filter, which keeps the target in track even when the trackers fail. Lastly, the refinement process decides the final location of the target from the ROIs produced by the trackers and the predictor. In experiments on our dataset of tiny flying drones, the proposed method achieved an average success rate 1.134 times higher than that of conventional tracking methods and ran at an average of 21.08 frames per second.
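The abstract does not give the predictor's internals, but a constant-velocity Kalman filter over the target's image-plane center is the textbook form of such a trajectory predictor. The sketch below is a minimal illustration under that assumption; the state layout, noise settings, and class name are illustrative choices, not the paper's implementation.

```python
import numpy as np

class ConstantVelocityKalman:
    """Minimal sketch: predicts a target's (x, y) center with a
    constant-velocity model. State: [x, y, vx, vy], dt = 1 frame.
    All matrices are standard textbook choices, not from the paper."""

    def __init__(self, q=1e-2, r=1.0):
        self.x = np.zeros(4)                   # state estimate
        self.P = np.eye(4) * 100.0             # state covariance
        self.F = np.eye(4)                     # transition matrix
        self.F[0, 2] = self.F[1, 3] = 1.0      # x += vx, y += vy
        self.H = np.eye(2, 4)                  # observe position only
        self.Q = np.eye(4) * q                 # process noise
        self.R = np.eye(2) * r                 # measurement noise

    def predict(self):
        # Called every frame; the return value can stand in for the
        # target position when both trackers fail.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, z):
        # z: measured (x, y) center from a tracker; skipped on failure.
        y = np.asarray(z, dtype=float) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
```

Calling `predict()` every frame and `update()` only when a tracker reports a confident ROI is what lets the predictor bridge tracker failures, as the abstract describes.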
Tracking unmanned aerial vehicles (UAVs) in outdoor scenes poses significant challenges due to their dynamic motion, diverse sizes, and changes in appearance. This paper proposes an efficient hybrid tracking method for UAVs comprising a detector, a tracker, and an integrator. The integrator combines detection and tracking and updates the target's features online during tracking, thereby addressing the aforementioned challenges. The online update mechanism ensures robust tracking under object deformation, diverse types of UAVs, and background changes. We conducted experiments on custom and public UAV datasets, including the commonly used UAV123 and UAVL benchmarks, to train the deep-learning-based detector, evaluate the tracking methods, and demonstrate generalizability. The experimental results show the effectiveness and robustness of the proposed method under challenging conditions, such as out-of-view and low-resolution scenarios, and demonstrate its performance on UAV detection tasks.
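The abstract does not specify how the integrator fuses the two sources or updates features, so the following is only a plausible sketch: IoU gating between the detector and tracker boxes, with an exponential-moving-average feature update. The function names, the fusion rule, and the EMA form are all assumptions for illustration.

```python
import numpy as np

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter + 1e-9)

def integrate(det_box, trk_box, template, det_feat, alpha=0.1, gate=0.3):
    """Hypothetical integrator step: fuse one detection and one tracker
    box, then refresh the online feature template. Returns the fused
    box and the (possibly updated) template."""
    if det_box is None:                  # detector missed: trust the tracker
        return trk_box, template
    if trk_box is None or iou(det_box, trk_box) < gate:
        # Disagreement or tracker loss: re-initialise on the detection.
        return det_box, det_feat
    # Agreement: average the boxes and blend features online (EMA).
    fused = tuple((d + t) / 2.0 for d, t in zip(det_box, trk_box))
    template = (1 - alpha) * template + alpha * det_feat
    return fused, template
```

Whatever the paper's exact rule, the design point carried by this sketch is the same one the abstract makes: periodic re-initialisation from the detector handles out-of-view targets, while the online template update absorbs deformation and background change.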
This paper proposes a point cloud (PC) visual quality assessment (VQA) framework that reflects the human visual system (HVS). The framework compares natural images acquired with a digital camera against PC images generated by 2D projection, using appropriate objective quality metrics. Because humans primarily consume natural images, human visual knowledge is largely formed from them, so natural images can serve as more reliable reference data than PC data. To use the natural images as references, the framework performs an image alignment process based on feature matching and image warping, which enhances the similarity between the acquired natural images and the corresponding PC images. The framework thereby facilitates identifying which objective VQA metrics reflect the HVS effectively. We constructed a database of natural images and PC images at three quality levels, and conducted both objective and subjective VQA. The experimental results demonstrate that acceptable consistency across the different PC qualities appears in metrics that compare the global structural similarity of images: SSIM, MAD, and GMSD achieved remarkable Spearman rank-order correlation coefficients of 0.882, 0.871, and 0.930, respectively. Thus, the proposed framework can reflect the HVS by comparing the global structural similarity between PC images and natural reference images.
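As a rough illustration of the alignment-then-score pipeline the abstract outlines, the sketch below matches features between a projected-PC image and its natural reference, warps the projection onto the reference with a RANSAC homography, and scores SSIM on the result. ORB features and the specific thresholds are stand-in assumptions; the paper's exact detector, warping model, and metric settings are not given in the abstract.

```python
import cv2
import numpy as np
from skimage.metrics import structural_similarity

def align_and_score(natural_bgr, pc_render_bgr):
    """Sketch: align a 2D-projected PC image to a natural reference
    image via feature matching + homography warping, then compute SSIM."""
    ref = cv2.cvtColor(natural_bgr, cv2.COLOR_BGR2GRAY)
    src = cv2.cvtColor(pc_render_bgr, cv2.COLOR_BGR2GRAY)

    # Detect and match ORB features (an assumed choice of detector).
    orb = cv2.ORB_create(2000)
    k1, d1 = orb.detectAndCompute(src, None)
    k2, d2 = orb.detectAndCompute(ref, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
    matches = sorted(matches, key=lambda m: m.distance)[:200]

    # Estimate a homography from the matches and warp the PC image
    # onto the natural reference.
    pts_src = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    pts_ref = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(pts_src, pts_ref, cv2.RANSAC, 5.0)
    warped = cv2.warpPerspective(src, H, (ref.shape[1], ref.shape[0]))

    # SSIM here stands in for any of the global-structure metrics
    # (SSIM, MAD, GMSD) the abstract reports.
    return structural_similarity(ref, warped)
```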