The booming interest in Unmanned Aerial Vehicles (UAVs) is fueled by their potentially great impact; however, progress is hindered by their limited perception capabilities. While vision-based odometry has been shown to run successfully onboard UAVs, loop-closure detection, needed to correct for drift or to recover from tracking failures, has so far proven particularly challenging for UAVs. At the heart of this lies the problem of viewpoint-tolerant place recognition: in stark contrast to ground robots, UAVs can revisit a scene from very different viewpoints. As a result, existing approaches struggle greatly, as the task at hand violates underlying assumptions in assessing scene similarity. In this paper, we propose a place recognition framework that exploits both efficient binary features and noisy estimates of the local 3D geometry, which are computed anyway for visual-inertial odometry onboard the UAV. By attaching both an appearance and a geometry signature to each 'location', the proposed approach demonstrates unprecedented recall at perfect precision, as well as high-quality loop-closing transformations, on both flying and hand-held datasets exhibiting large viewpoint and appearance changes as well as perceptual aliasing.
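To illustrate the idea of scoring candidate locations by both an appearance and a geometry signature, here is a minimal sketch; the bit-packed binary descriptors, the histogram-style geometry signature, and all function names and weights are illustrative assumptions, not the paper's actual pipeline:

```python
import numpy as np


def hamming(a, b):
    """Hamming distance between two bit-packed uint8 binary descriptors."""
    return int(np.unpackbits(a ^ b).sum())


def appearance_score(query_descs, loc_descs, max_dist=64):
    """Fraction of query descriptors with a close-enough match in the location.

    Assumed appearance signature: a set of 256-bit binary descriptors
    (e.g. BRISK/ORB-like), matched by Hamming distance.
    """
    matched = sum(
        1 for q in query_descs
        if min(hamming(q, d) for d in loc_descs) <= max_dist
    )
    return matched / len(query_descs)


def geometry_score(query_sig, loc_sig):
    """Similarity of two normalized 3D-structure histograms (1 - half L1).

    Assumed geometry signature: a coarse histogram over the local 3D
    point cloud already estimated by visual-inertial odometry.
    """
    return 1.0 - 0.5 * float(np.abs(query_sig - loc_sig).sum())


def location_score(query, loc, w_app=0.5):
    """Combined score in [0, 1]; w_app weighs appearance vs. geometry."""
    return (w_app * appearance_score(query["descs"], loc["descs"])
            + (1.0 - w_app) * geometry_score(query["geom"], loc["geom"]))
```

A candidate match would then be accepted only if the combined score clears a threshold, so that perceptual aliasing (similar appearance, different 3D structure) is rejected by the geometry term.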
Video: https://youtu.be/8VkR_nSbR34
Datasets: http://www.v4rl.ethz.ch/research/datasets-code.html

• a new, carefully designed place recognition pipeline especially developed for robot navigation, which avoids