Fast-moving animals depend on cues derived from the optic flow on their retina. Optic flow generated by translational locomotion carries information about the three-dimensional layout of the environment, whereas optic flow experienced during rotational self-motion does not. A saccadic gaze strategy that segregates rotations from translational movements during locomotion should therefore facilitate the extraction of spatial information from the visual input. We tested whether birds use such a strategy by recording zebra finches with high-speed video from two directions during an obstacle avoidance task. Each frame of the recording was examined to derive the position and orientation of the beak in three-dimensional space. The data show that in all flights head orientation shifted in a saccadic fashion and was held steady between saccades. Birds thus use a gaze strategy that actively stabilizes gaze during translation, simplifying optic-flow-based navigation. This is the first evidence of birds actively optimizing optic flow during flight.
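The asymmetry between translational and rotational optic flow described above can be made concrete with the standard perspective flow equations. The sketch below (illustrative, not part of the study's analysis) shows that the translational flow component scales with inverse depth while the rotational component does not, which is why gaze stabilization during translation preserves depth information:

```python
def flow(x, y, Z, T, omega):
    """Image-plane optic flow for a point (x, y) at depth Z (focal length 1),
    given translation T = (Tx, Ty, Tz) and rotation omega = (wx, wy, wz).
    Standard perspective flow equations for a moving observer."""
    Tx, Ty, Tz = T
    wx, wy, wz = omega
    # Translational component scales with 1/Z -> carries depth information.
    u_t = (-Tx + x * Tz) / Z
    v_t = (-Ty + y * Tz) / Z
    # Rotational component is independent of Z -> carries no depth information.
    u_r = x * y * wx - (1 + x**2) * wy + y * wz
    v_r = (1 + y**2) * wx - x * y * wy - x * wz
    return (u_t + u_r, v_t + v_r)

# Pure translation: near and far points move at different image speeds.
near = flow(0.1, 0.0, Z=1.0, T=(1, 0, 0), omega=(0, 0, 0))
far = flow(0.1, 0.0, Z=10.0, T=(1, 0, 0), omega=(0, 0, 0))
# Pure rotation: the flow is identical regardless of depth.
rot_near = flow(0.1, 0.0, Z=1.0, T=(0, 0, 0), omega=(0, 1, 0))
rot_far = flow(0.1, 0.0, Z=10.0, T=(0, 0, 0), omega=(0, 1, 0))
```

Only during pure translation does the flow field differ between near and far points, so keeping the head straight between saccades maximizes the fraction of flight time in which such depth cues are available.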
Changes in flight direction in flying insects are largely due to roll, yaw and pitch rotations of their body, while head orientation is stabilized most of the time by counter-rotation. Here, we use high-speed video to analyse head and body movements of the bumblebee Bombus terrestris while approaching and departing from a food source located between three landmarks in an indoor flight arena. The flight paths consist of almost straight flight segments interspersed with rapid turns. These short, fast yaw turns ("saccades") are usually accompanied by even faster head yaw turns that change gaze direction. Because a large part of image rotation is thereby compressed into brief instants of time, this behavioural pattern facilitates depth perception from visual motion parallax during the intersaccadic intervals. A detailed analysis of the fine structure of the bees' head turning movements shows that the time course of single head saccades is highly stereotyped. We find a consistent relationship between the duration, peak velocity and amplitude of saccadic head movements that in its main characteristics resembles the so-called "saccadic main sequence" in humans. The fact that bumblebee head saccades are as highly stereotyped as human ones may hint at a common principle in which fast and precise motor control is used to reliably minimize the time during which the retinal image moves.
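The "main sequence" relationship mentioned above can be checked with a simple regression: across saccades, peak velocity grows systematically with amplitude. A minimal sketch, using made-up illustrative numbers rather than the measured bumblebee data:

```python
import numpy as np

# Hypothetical saccade amplitudes (deg) and peak head yaw velocities (deg/s);
# the values are illustrative, not recorded bumblebee data.
amplitude = np.array([10.0, 20.0, 30.0, 40.0, 60.0, 80.0])
peak_vel = np.array([310.0, 590.0, 920.0, 1180.0, 1750.0, 2400.0])

# "Main sequence": peak velocity as an approximately linear function of
# amplitude over this range, as reported for human eye saccades.
slope, intercept = np.polyfit(amplitude, peak_vel, 1)
predicted = slope * amplitude + intercept
r = np.corrcoef(peak_vel, predicted)[0, 1]  # goodness of the linear fit
```

A tight, positively sloped fit of this kind is what "stereotyped" means operationally: knowing a saccade's amplitude largely determines its peak velocity and duration.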
Bees use visual memories to find the spatial location of previously learnt food sites. Characteristic learning flights help to acquire these memories at newly discovered foraging locations, where landmarks—salient objects in the vicinity of the goal location—can play an important role in guiding the animal's homing behavior. Although behavioral experiments have shown that bees can use a variety of visual cues to distinguish objects as landmarks, the question of how landmark features are encoded by the visual system is still open. Recently, it was shown that motion cues are sufficient to allow bees to localize their goal using landmarks that can hardly be discriminated from the background texture. Here, we tested the hypothesis that motion-sensitive neurons in the bee's visual pathway provide information about such landmarks during a learning flight and might thus play a role in goal localization. We tracked learning flights of free-flying bumblebees (Bombus terrestris) in an arena with distinct visual landmarks, reconstructed the visual input during these flights, and replayed ego-perspective movies to tethered bumblebees while recording the activity of direction-selective wide-field neurons in their optic lobe. By comparing neuronal responses during a typical learning flight with responses to targeted modifications of landmark properties in this movie, we demonstrate that these objects are indeed represented in the bee's visual motion pathway. We find that object-induced responses vary little with object texture, which is in agreement with behavioral evidence. These neurons thus convey information about landmark properties that are useful for view-based homing.
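Direction-selective motion responses of the kind recorded here are classically modelled with correlation-type elementary motion detectors. The sketch below is a generic, minimal Hassenstein–Reichardt correlator array (an assumption about the underlying computation, not the study's recorded neurons) whose summed output changes sign with motion direction:

```python
import numpy as np

def reichardt_response(stimulus, delay=1):
    """Summed output of a 1-D array of correlation-type motion detectors.
    stimulus: 2-D array (time x space) of brightness values.
    A minimal sketch of the direction-selective computation assumed to
    feed wide-field neurons in the insect optic lobe."""
    delayed = np.roll(stimulus, delay, axis=0)
    delayed[:delay] = 0.0  # no signal before stimulus onset
    # Correlate each pixel's delayed signal with its right neighbour, and
    # mirror-symmetrically the other way; the difference is direction-selective.
    rightward_term = delayed[:, :-1] * stimulus[:, 1:]
    leftward_term = stimulus[:, :-1] * delayed[:, 1:]
    return float(np.sum(rightward_term - leftward_term))

# A bright bar sweeping rightward vs leftward across 8 pixels in 8 frames.
t, x = np.arange(8)[:, None], np.arange(8)[None, :]
rightward = (x == t).astype(float)
leftward = (x == 7 - t).astype(float)
```

The signed output illustrates how replaying ego-perspective movies to such detectors yields responses that depend on the direction and speed of landmark-induced image motion.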
Keywords: bees; features; geometry; homing; learning; orientation; snapshot matching; space perception; view-based navigation; vision

How do bees employ multiple visual cues for homing? They could either combine the available cues using a view-based computational mechanism or pick one cue. We tested these strategies by training honeybees, Apis mellifera carnica, and bumblebees, Bombus terrestris, to locate food in one of the four corners of a box-shaped flight arena, providing multiple and also ambiguous cues. In tests, bees confused the diagonally opposite corners, which looked the same from the inside of the box owing to its rectangular shape and because these corners carried the same local colour cues. These 'rotational errors' indicate that the bees did not use compass information inferred from the geomagnetic field under our experimental conditions. When we then swapped cues between corners, bees preferred corners that had local cues similar to the trained corner, even when the geometric relations were incorrect. Apparently, they relied on views, a finding that we corroborated by computer simulations in which we assumed that bees try to match a memorized view of the goal location with the current view when they return to the box. However, when extra visual cues outside the box were provided, bees were able to resolve the ambiguity and locate the correct corner. We show that this performance cannot be explained by view matching from inside the box. Instead, the bees adapted their behaviour and actively acquired information by leaving the arena and flying towards the cues outside the box. From there they re-entered the arena at the correct corner, now ignoring local cues that previously dominated their choices. All individuals of both species came up with this new behavioural strategy for solving the problem posed by the local ambiguity within the box.
Thus, both species seemed to solve the ambiguous task by using their route memory, which is always available during their natural foraging behaviour.
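The view-matching simulation idea described above can be sketched in a few lines: memorize a snapshot at the goal and choose the location whose current view minimizes the image difference. In a rectangular arena with identical local cues at diagonal corners, this predicts exactly the observed rotational errors. The views below are toy illustrative vectors, not the study's reconstructed panoramas:

```python
import numpy as np

# Toy panoramic "views" (1-D brightness snapshots) at the four corners of a
# rectangular arena. The arena's 180-degree rotational symmetry makes
# diagonally opposite corners yield identical views from the inside.
views = {
    "A": np.array([1.0, 0.0, 1.0, 1.0, 0.0, 0.0]),
    "B": np.array([0.0, 1.0, 1.0, 0.0, 0.0, 1.0]),
    "C": np.array([1.0, 0.0, 1.0, 1.0, 0.0, 0.0]),  # diagonal of A: same view
    "D": np.array([0.0, 1.0, 1.0, 0.0, 0.0, 1.0]),  # diagonal of B: same view
}

def matching_corners(memorized, views):
    """Corners whose current view best matches the memorized snapshot
    (minimal root-mean-square image difference)."""
    diff = {c: float(np.sqrt(np.mean((v - memorized) ** 2)))
            for c, v in views.items()}
    best = min(diff.values())
    return sorted(c for c, d in diff.items() if abs(d - best) < 1e-9)
```

A snapshot memorized at corner A matches both A and its diagonal C equally well, so pure view matching from inside the box cannot break the tie; only cues from outside the arena, or a route memory, can.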
Sociality is classified as one of the major transitions in evolution, with the largest number of eusocial species found in the insect order Hymenoptera, including the Apini (honey bees) and the Bombini (bumble bees). Bumble bees and honey bees not only differ in their social organization and foraging strategies, but comparative analyses of their genomes have demonstrated that bumble bees possess a slightly less diverse family of olfactory receptors than honey bees, suggesting that their olfactory abilities have adapted to different social and/or ecological conditions. However, no precise comparison of olfactory coding between honey bees and bumble bees has been performed so far, and little is known about the rules underlying olfactory coding in the bumble bee brain. In this study, we used in vivo calcium imaging to study olfactory coding of a panel of floral odorants in the antennal lobe of the bumble bee Bombus terrestris. Our results show that odorants induce reproducible neuronal activity in the bumble bee antennal lobe. Each odorant evokes a different glomerular activity pattern reflecting the molecule's chemical structure, i.e. its carbon chain length and functional group. In addition, pairwise similarities among odor representations are conserved between bumble bees and honey bees. This study thus suggests that bumble bees, like honey bees, are equipped to respond to odorants according to their chemical features.
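The notion of conserved pairwise similarity between odor representations can be quantified by correlating glomerular activity patterns across odorants, as is standard in calcium-imaging studies. A minimal sketch with hypothetical activity vectors (illustrative numbers, not the recorded data):

```python
import numpy as np

# Hypothetical glomerular activity patterns (one value per imaged glomerulus)
# evoked by three odorants; the numbers are illustrative, not recorded data.
patterns = {
    "1-hexanol": np.array([0.9, 0.1, 0.4, 0.0, 0.2]),
    "1-octanol": np.array([0.8, 0.2, 0.5, 0.1, 0.2]),   # alcohol, longer chain
    "2-heptanone": np.array([0.1, 0.9, 0.0, 0.7, 0.3]), # different functional group
}

def pattern_similarity(a, b):
    """Pearson correlation between two glomerular activity patterns."""
    return float(np.corrcoef(a, b)[0, 1])

# Odorants sharing chain length and functional group evoke similar patterns;
# chemically dissimilar odorants evoke dissimilar (here anticorrelated) ones.
sim_alcohols = pattern_similarity(patterns["1-hexanol"], patterns["1-octanol"])
sim_across = pattern_similarity(patterns["1-hexanol"], patterns["2-heptanone"])
```

Computing such similarity matrices separately for each species and then correlating the two matrices is one way to test whether the similarity structure of the odor space is conserved.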