Path integration is a widespread navigational strategy in which directional changes and distance covered are continuously integrated on an outward journey, enabling a straight-line return home. Bees use vision for this task (a celestial-cue-based visual compass and an optic-flow-based visual odometer), but the underlying neural integration mechanisms are unknown. Using intracellular electrophysiology, we show that polarized-light-based compass neurons and optic-flow-based speed-encoding neurons converge in the central complex of the bee brain, and through block-face electron microscopy, we identify potential integrator cells. Based on plausible output targets for these cells, we propose a complete circuit for path integration and steering in the central complex, with anatomically identified neurons suggested for each processing step. The resulting model circuit is thus fully constrained biologically and provides a functional interpretation for many previously unexplained architectural features of the central complex. Moreover, we show that the receptive fields of the newly discovered speed neurons can support path integration for the holonomic motion (i.e., a ground velocity that is not precisely aligned with body orientation) typical of bee flight, a feature not captured in any previously proposed model of path integration. In a broader context, the model circuit presented provides a general mechanism for producing steering signals by comparing current and desired headings, suggesting a more basic function for central complex connectivity, from which path integration may have evolved.
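The comparison of current and desired headings at the heart of such a circuit can be sketched in a few lines. This is an illustrative toy model, not the paper's circuit: the function names and the sine-shaped turn command are assumptions, chosen because a sine of the heading mismatch gives a smooth, correctly wrapped steering signal.

```python
import math

def steering_signal(current_heading, desired_heading):
    """Signed turn command from the heading mismatch (radians).

    Positive means 'turn left', negative 'turn right'; the sine keeps the
    command smooth and wraps correctly across the +/- pi boundary.
    """
    return math.sin(desired_heading - current_heading)

def home_heading(x, y):
    """Heading pointing from the current position (x, y) back to a nest
    at the origin, i.e. the direction of the accumulated home vector."""
    return math.atan2(-y, -x)
```

In a path-integration setting, the home vector accumulated on the outward trip supplies the desired heading, and the compass supplies the current one; the steering signal drives turns until the two agree.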
The deep sea is the largest habitat on Earth. Its three great faunal environments (the twilight mesopelagic zone, the dark bathypelagic zone and the vast flat expanses of the benthic habitat) are home to a rich fauna of vertebrates and invertebrates. In the mesopelagic zone (150-1000 m), the down-welling daylight creates an extended scene that becomes increasingly dimmer and bluer with depth. The available daylight also originates increasingly from vertically above, and bioluminescent point-source flashes, well contrasted against the dim background daylight, become increasingly visible. In the bathypelagic zone below 1000 m, no daylight remains, and the scene becomes entirely dominated by point-like bioluminescence. This changing nature of visual scenes with depth, from extended source to point source, has had a profound effect on the designs of deep-sea eyes, both optically and neurally, a fact that until recently was not fully appreciated. Recent measurements of the sensitivity and spatial resolution of deep-sea eyes, particularly from the camera eyes of fishes and cephalopods and the compound eyes of crustaceans, reveal that ocular designs are well matched to the nature of the visual scene at any given depth. This match between eye design and visual scene is the subject of this review. The greatest variation in eye design is found in the mesopelagic zone, where dim down-welling daylight and bioluminescent point sources may be visible simultaneously. Some mesopelagic eyes rely on spatial and temporal summation to increase sensitivity to a dim extended scene, while others sacrifice this sensitivity to localise pinpoints of bright bioluminescence. Yet other eyes have retinal regions separately specialised for each type of light. In the bathypelagic zone, eyes generally become smaller, and therefore less sensitive to point sources, with increasing depth.
In fishes, this insensitivity, combined with surprisingly high spatial resolution, is well suited to the detection and localisation of point-source bioluminescence at ecologically meaningful distances. At all depths, the eyes of animals active on and over the nutrient-rich sea floor are generally larger than those of pelagic species. In fishes, the retinal ganglion cells are also frequently arranged in a horizontal visual streak, an adaptation for viewing the wide flat horizon of the sea floor and the animals living on it. These and many other aspects of light and vision in the deep sea are reviewed in support of the following conclusion: it is not only the intensity of light at different depths, but also its distribution in space, that has been a major force in the evolution of deep-sea vision.
Animals that need to see well at night generally have eyes with wide pupils. This optical strategy to improve photon capture may be enhanced neurally by summing the outputs of neighbouring visual channels (spatial summation) or by increasing the length of time a sample of photons is counted by the eye (temporal summation). These summation strategies, however, come at the cost of spatial and temporal resolution. A simple analytical model is developed to investigate whether the improved photon catch afforded by summation really improves vision in dim light, or whether the losses in resolution actually make vision worse. The model, developed for both vertebrate camera eyes and arthropod compound eyes, calculates the finest spatial detail perceivable by a given eye design at a specified light intensity and image velocity. Visual performance is calculated for the apposition compound eye of the locust, the superposition compound eye of the dung beetle and the camera eye of the nocturnal toad. The results reveal that spatial and temporal summation is extremely beneficial to vision in dim light, especially in small eyes (e.g. compound eyes), which have a restricted ability to collect photons optically. The model predicts that, using optimum spatiotemporal summation, the locust can extend its vision to light intensities more than 100,000 times dimmer than if it relied on its optics alone. The relative amounts of spatial and temporal summation predicted to be optimal in dim light depend on the image velocity. Animals that are sedentary and rely on seeing small, slow images (such as the toad) are predicted to rely more on temporal summation and less on spatial summation. The opposite strategy is predicted for animals that need to see large, fast images. The predictions of the model agree very well with the known visual behaviours of nocturnal animals.
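The trade-off this model formalizes can be illustrated with a toy calculation. The quantities and units below are hypothetical placeholders, not the model's actual equations: photon catch grows linearly with aperture area, dwell time and the number of pooled channels, while reliability is shot-noise limited.

```python
import math

def photon_catch(intensity, aperture_area, dwell_time, pooled_channels=1):
    """Expected photon count per sample: optical capture (aperture area)
    times neural summation (longer dwell time, more pooled channels).
    All quantities are in arbitrary illustrative units."""
    return intensity * aperture_area * dwell_time * pooled_channels

def snr(n_photons):
    """Shot-noise-limited signal-to-noise ratio under Poisson statistics."""
    return math.sqrt(n_photons)
```

For example, pooling four channels while doubling the dwell time multiplies the expected catch by eight and the SNR by about 2.8, but it also coarsens the finest resolvable spatial detail and increases motion blur, which is exactly the trade-off the model optimizes.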
Visual ecology is the study of how animals use visual systems to meet their ecological needs, how these systems have evolved, and how they are specialized for particular visual tasks. This book provides the first up-to-date synthesis of the field to appear in more than three decades. Featuring some 225 illustrations, including more than 140 in color, spread throughout the text, the book begins by discussing the basic properties of light and the optical environment. It then looks at how photoreceptors intercept light and convert it to usable biological signals, how the pigments and cells of vision vary among animals, and how the properties of these components affect a given receptor's sensitivity to light. The book goes on to examine how eyes and photoreceptors become specialized for an array of visual tasks, such as navigation, evading predators, mate choice, and communication. A timely and much-needed resource for students and researchers alike, the book also includes a glossary and a wealth of examples drawn from the full diversity of visual systems.
Recent studies have shown that certain nocturnal insect and vertebrate species have true color vision under nocturnal illumination. Thus, their vision is potentially affected by changes in the spectral quality of twilight and nocturnal illumination, due to the presence or absence of the moon, artificial light pollution and other factors. We investigated this in the following manner. First, we measured the spectral irradiance (from 300 to 700 nm) during the day, sunset, twilight, full moon, new moon, and in the presence of high levels of light pollution. The spectra were then converted both to human-based chromaticities and to relative quantum catches for the nocturnal hawkmoth Deilephila elpenor, which has color vision. The reflectance spectra of various flowers and leaves and of the red hindwings of D. elpenor were also converted to chromaticities and relative quantum catches. Finally, the achromatic and chromatic contrasts (with and without von Kries color constancy) of the flowers and hindwings against a leaf background were determined under the various lighting environments. The twilight and nocturnal illuminants were substantially different from each other, resulting in significantly different contrasts. The addition of von Kries color constancy significantly reduced the effect of changing illuminants on chromatic contrast, suggesting that, even in this light-limited environment, the ability of color vision to provide reliable signals under changing illuminants may offset the concurrent threefold decrease in sensitivity and spatial resolution. Given this, color vision may be more common in crepuscular and nocturnal species than previously considered.
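The quantum-catch and von Kries steps used in such analyses can be sketched as follows. The discrete wavelength sampling and the function names are illustrative assumptions; real analyses integrate measured spectra over wavelength.

```python
def quantum_catch(reflectance, illuminant, sensitivity):
    """Relative quantum catch of one receptor class: the sum over
    wavelength samples of stimulus reflectance x illuminant irradiance
    x receptor spectral sensitivity."""
    return sum(r * i * s for r, i, s in zip(reflectance, illuminant, sensitivity))

def von_kries(catches, background_catches):
    """von Kries adaptation: scale each receptor's catch by its catch of
    the adapting background (here, the leaf background), discounting the
    illuminant receptor by receptor."""
    return [q / qb for q, qb in zip(catches, background_catches)]
```

Because each receptor is normalized by the same background seen under the same illuminant, the scaled catches change far less when the illuminant shifts, which is why von Kries constancy stabilizes chromatic contrast across lighting environments.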
Diurnal and nocturnal African dung beetles use celestial cues, such as the sun, the moon, and the polarization pattern, to roll dung balls along straight paths across the savanna. Although nocturnal beetles move in the same manner through the same environment as their diurnal relatives, they do so when light conditions are at least 1 million-fold dimmer. Here, we show, for the first time to our knowledge, that the celestial cue preference differs between nocturnal and diurnal beetles in a manner that reflects their contrasting visual ecologies. We also demonstrate how these cue preferences are reflected in the activity of compass neurons in the brain. At night, polarized skylight is the dominant orientation cue for nocturnal beetles. However, if we coerce them to roll during the day, they instead use a celestial body (the sun) as their primary orientation cue. Diurnal beetles, however, persist in using a celestial body for their compass, day or night. Compass neurons in the central complex of diurnal beetles are tuned only to the sun, whereas the same neurons in the nocturnal species switch exclusively to polarized light at lunar light intensities. Thus, these neurons encode the preferences for particular celestial cues and alter their weighting according to ambient light conditions. This flexible encoding of celestial cue preferences relative to the prevailing visual scenery provides a simple, yet effective, mechanism for enabling visual orientation at any light intensity. The blue sky is a rich source of visual cues that are used by many animals during orientation or navigation (1, 2). Besides the sun, celestial phenomena, such as the skylight intensity gradient or the more complex polarization pattern, can serve as references for spatial orientation (3-5).
Polarized skylight is generated by sunlight scattered in the atmosphere, and to a terrestrial observer, the resulting alignment of the electric field vectors extends across the entire sky, forming concentric circles around the position of the sun (Fig. 1A). A similar distribution of brightness and polarization is also created around the moon (6). Although this nocturnal pattern is 1 million-fold dimmer than the daylight pattern (6), some animals, such as South African ball-rolling dung beetles, can use this lunar polarization pattern for orientation (7). To avoid competition for food at the dung pile, these beetles detach a piece of dung, shape it into a ball, and roll it away along a straight-line path. For this type of straight-line orientation, nocturnal beetles seem to rely exclusively on celestial cues (8), such as the moon or polarized light. As with all nocturnal animals, night-active beetles have to overcome a major challenge: they need to maintain high orientation precision even under extremely dim light conditions. Indeed, recent experiments have shown that nocturnal dung beetles orient at night with the same precision as their diurnal relatives during the day (9), an ability due in part to the fact that their eyes are considerably more sensitive…
Despite the scarcity of photons, Megalopta is able to visually orient to landmarks at night in a dark forest understory, an ability permitted by unusually sensitive apposition eyes and neural photon summation.
The fraction F of incident light absorbed by a photoreceptor of length l has traditionally been given by F = 1 - e-kl, where k is the absorption coefficient of the photoreceptor. Unfortunately, this widely-used expression is incorrect for absorption of the type of light most common in natural scenes--broad spectrum "white" light--and significantly over-estimates absorption. This is because the measured values of k are only valid at the absorbance peak wavelength of rhodopsin, whereas at other wavelengths (which the eye may also see) k is lower. We have accounted for the wavelength dependence of k and calculated the absorption of white light from four different natural radiant sources: the quantal irradiances of natural daylight and a patch of very blue sky, and the quantal reflections of soil and green foliage irradiated by natural daylight. Based on these results, a simple averaged correction for white light stimulation is derived, F = kl/(2.3 + kl), which is valid for a wide range of k and l, and therefore applicable to both vertebrate and invertebrate photoreceptors.
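The gap between the two expressions is easy to compute directly. The functions below transcribe the formulas above; the values of k and l in the example are arbitrary illustrative choices.

```python
import math

def fraction_absorbed_peak(k, l):
    """Traditional Beer-Lambert expression, valid only at the rhodopsin
    absorbance peak wavelength: F = 1 - exp(-k*l)."""
    return 1.0 - math.exp(-k * l)

def fraction_absorbed_white(k, l):
    """Averaged correction for broad-spectrum 'white' light:
    F = k*l / (2.3 + k*l)."""
    return (k * l) / (2.3 + k * l)
```

For instance, with kl = 1 (e.g. k = 0.01 per µm and l = 100 µm), the peak-wavelength expression gives F ≈ 0.63 while the white-light correction gives F ≈ 0.30, illustrating how strongly the traditional formula over-estimates absorption of natural broad-spectrum light.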