Landing is a challenging aspect of flight because, to land safely, speed must be decreased to a value close to zero at touchdown. The mechanisms by which animals achieve this remain unclear. When landing on horizontal surfaces, honey bees control their speed by holding constant the rate of front-to-back image motion (optic flow) generated by the surface as they reduce altitude. As inclination increases, however, this simple pattern of optic flow becomes increasingly complex. How do honey bees control speed when landing on surfaces that have different orientations? To answer this, we analyze the trajectories of honey bees landing on a vertical surface that produces various patterns of motion. We find that landing honey bees control their speed by holding the rate of expansion of the image constant. We then test and confirm this hypothesis rigorously by analyzing landings when the apparent rate of expansion generated by the surface is manipulated artificially. This strategy ensures that speed is reduced, gradually and automatically, as the surface is approached. We then develop a mathematical model of this strategy and show that it can effectively be used to guide smooth landings on surfaces of any orientation, including horizontal surfaces. This biological strategy for guiding landings does not require knowledge about either the distance to the surface or the speed at which it is approached. The simplicity and generality of this landing strategy suggest that it is likely to be exploited by other flying animals and make it ideal for implementation in the guidance systems of flying robots.

Orchestrating a safe landing is one of the greatest challenges for flying animals and airborne vehicles alike.
Although some progress has been made toward unraveling the cues that flying animals might use for triggering landings (1-10), we do not yet have a good understanding of how these or other possible cues are used to control the landing process once it has been initiated.

To achieve a smooth landing, it is essential to control deceleration in such a manner that the approach speed decreases to a value close to zero at the time of touchdown. An obvious way to achieve this would be to measure flight speed and distance to the target simultaneously and to use this information to reduce speed progressively, in a moment-to-moment fashion. However, this strategy is computationally demanding and unsuitable for animals such as flying insects, whose close-set, fixed-focus eyes prevent them from using stereopsis or accommodation to measure the distances to surfaces directly (11-13).

When performing a grazing landing on a horizontal surface, honey bees use a technique that allows them to overcome the limitations of their relatively simple nervous systems. Instead of measuring the distance to the surface directly, they hold constant the magnitude of optic flow (the speed of image motion on the retina) that is generated by the ground beneath them (4, 5). This automatically ensures that the speed of flight is reduced as the ground is approached...
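The constant-expansion-rate strategy described above can be sketched numerically. The following toy simulation (function and variable names are our own, not from the paper) holds the ratio of approach speed to distance constant, which is equivalent to holding the relative rate of image expansion constant; speed then decays automatically toward zero as the surface is neared, without any explicit measurement of distance or speed:

```python
def landing_trajectory(d0, r, dt=0.01, t_max=10.0):
    """Simulate an approach that holds the relative rate of image
    expansion (approach speed / distance to surface) constant at r.

    With v/d = r fixed, the distance obeys d'(t) = -r * d(t), so both
    distance and speed decay exponentially and touchdown speed tends
    to zero -- the smooth-landing property described in the text.
    """
    d, t, traj = d0, 0.0, []
    while t < t_max:
        v = r * d            # speed commanded from the expansion rate alone
        traj.append((t, d, v))
        d -= v * dt          # Euler step of d' = -v
        t += dt
    return traj

# Start 1 m from the surface with an expansion rate of 0.5 s^-1.
traj = landing_trajectory(d0=1.0, r=0.5)
```

By the end of the simulated approach both distance and speed have fallen by roughly two orders of magnitude, and the speed is proportional to the remaining distance at every instant.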
When the moon is absent from the night sky, stars remain as celestial visual cues. Nonetheless, only birds, seals, and humans are known to use stars for orientation. African ball-rolling dung beetles exploit the sun, the moon, and the celestial polarization pattern to move along straight paths, away from the intense competition at the dung pile. Even on clear moonless nights, many beetles still manage to orientate along straight paths. This led us to hypothesize that dung beetles exploit the starry sky for orientation, a feat that has, to our knowledge, never been demonstrated in an insect. Here, we show that dung beetles transport their dung balls along straight paths under a starlit sky but lose this ability under overcast conditions. In a planetarium, the beetles orientate equally well when rolling under a full starlit sky as when only the Milky Way is present. The use of this bidirectional celestial cue for orientation has been proposed for vertebrates, spiders, and insects, but never proven. This finding represents the first convincing demonstration for the use of the starry sky for orientation in insects and provides the first documented use of the Milky Way for orientation in the animal kingdom.
SUMMARY
Visual landmarks guide humans and animals, including insects, to a goal location. Insects, with their miniature brains, have evolved a simple strategy to find their nests or profitable food sources; they approach a goal by finding a close match between the current view and a memorised retinotopic representation of the landmark constellation around the goal. Recent implementations of such a matching scheme use raw panoramic images ('image matching') and show that it is well suited to work on robots and even in natural environments. However, this matching scheme works only if relevant landmarks can be detected by their contrast and texture. Therefore, we tested how honeybees perform in localising a goal if the landmarks can hardly be distinguished from the background by such cues. We recorded the honeybees' flight behaviour with high-speed cameras and compared the search behaviour with computer simulations. We show that honeybees are able to use landmarks that have the same contrast and texture as the background and suggest that the bees use relative motion cues between the landmark and the background. These cues are generated on the eyes when the bee moves in a characteristic way in the vicinity of the landmarks. This extraordinary navigation performance can be explained by a matching scheme that includes snapshots based on optic flow amplitudes ('optic flow matching'). This new matching scheme provides a robust strategy for navigation, as it depends primarily on the depth structure of the environment.

Supplementary material available online at http://jeb.biologists.org/cgi/content/full/213/17/2913/DC1

Key words: honeybee, landmark navigation, snapshot matching, vision.

THE JOURNAL OF EXPERIMENTAL BIOLOGY

...be unnecessary (Zeil et al., 2003; Stürzl and Zeil, 2007). Zeil et al. show that the similarities between panoramic images of natural environments decrease smoothly with spatial distance between an observer and the goal location (Zeil et al., 2003).
An animal that is sensitive to the similarity of views relative to the memorised view of the goal location could return to this location by maximising the similarities between images [modelled by simple image similarity gradient methods (Zeil et al., 2003)]. Thus, panoramic image similarities can be used for view-based homing in natural environments. Recently, the behaviour of ants and crickets in goal-finding tasks could be explained by 'image matching' (Wystrach and Beugnon, 2009; Mangan and Webb, 2009).

In our combined behavioural and modelling approach, we tested the content of the spatial memory in honeybees during complex navigational tasks. Honeybees were trained to locate an inconspicuous feeder surrounded by three cylinders, which we refer to as landmarks. By altering the spatial configuration and landmark texture and monitoring the approach flights to the feeder, we addressed the following questions: what role does the spatial configuration of the landmarks play? Does landmark texture play a role in navigational tasks? In particular, can landmarks b...
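The basic 'image matching' scheme referred to above can be illustrated with a minimal sketch. Everything here is a toy stand-in, assuming a smooth synthetic "world" in place of real panoramic imagery (the names `world`, `view_at`, and `home` are ours, and this models the plain pixel-matching scheme of Zeil et al., not the optic-flow-based variant proposed in the study): an agent memorises a snapshot at the goal and then homes by greedily reducing the pixel difference between its current view and that snapshot.

```python
import numpy as np

# Toy environment: a smooth intensity field. The patch around a position
# plays the role of a panoramic view (purely illustrative).
xs = np.arange(40.0)
world = (xs[:, None] ** 2 + xs[None, :] ** 2) / 1600.0

def view_at(pos, size=5):
    """'Panoramic view' at pos: the patch of the world surrounding it."""
    x, y = pos
    return world[x - size:x + size + 1, y - size:y + size + 1]

def image_difference(view, snapshot):
    """Mean squared pixel difference between the current view and the
    memorised snapshot; zero only when the views coincide."""
    return float(np.mean((view - snapshot) ** 2))

def home(start, snapshot, max_steps=200):
    """Greedy image matching: repeatedly step to whichever neighbouring
    position yields the view most similar to the memorised snapshot,
    stopping when no neighbour improves the match."""
    pos = start
    for _ in range(max_steps):
        best = min(
            ((pos[0] + dx, pos[1] + dy)
             for dx in (-1, 0, 1) for dy in (-1, 0, 1)),
            key=lambda p: image_difference(view_at(p), snapshot),
        )
        if best == pos:
            break
        pos = best
    return pos

goal = (20, 20)
snapshot = view_at(goal)   # snapshot memorised at the goal location
```

Because the image difference in this toy world decreases smoothly toward the goal, the greedy descent recovers the goal position from a displaced start; this is the property Zeil et al. report for panoramic images of natural scenes.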
The quality of visual information that is available to an animal is limited by the size of its eyes. Differences in eye size can be observed even between closely related individuals, yet we understand little about how this affects vision. Insects are good models for exploring the effects of size on visual systems because many insect species exhibit size polymorphism. Previous work has been limited by difficulties in determining the 3D structure of eyes. We have developed a novel method based on x-ray microtomography to measure the 3D structure of insect eyes and to calculate predictions of their visual capabilities. We used our method to investigate visual allometry in the bumblebee Bombus terrestris and found that size affects specific aspects of vision, including binocular overlap, optical sensitivity, and dorsofrontal visual resolution. This reveals that differential scaling between eye areas provides flexibility that improves the visual capabilities of larger bumblebees.