We present a computational model for target discrimination based on intracellular recordings from neurons in the fly visual system. Determining how insects detect and track small moving features, often against cluttered moving backgrounds, is an intriguing challenge, from both a physiological and a computational perspective. Previous research has characterized higher-order neurons within the fly brain, known as ‘small target motion detectors’ (STMDs), which respond robustly to moving features, even when the velocity of the target is matched to the background (i.e. with no relative motion cues). We recorded from intermediate-order neurons in the fly visual system that are well suited as a component along the target detection pathway. This full-wave rectifying, transient cell (RTC) reveals independent adaptation to luminance changes of opposite signs (suggesting separate ON and OFF channels) and fast adaptive temporal mechanisms, similar to other cell types previously described. From these physiological data we have created a numerical model for target discrimination. This model includes nonlinear filtering based on the fly optics, the photoreceptors, the first-order interneurons (Large Monopolar Cells), and the newly derived parameters for the RTC. We show that our RTC-based target detection model is well matched to properties described for the STMDs, such as contrast sensitivity, height tuning and velocity tuning. The model output shows that the spatiotemporal profile of small targets is sufficiently rare within natural scene imagery to allow our highly nonlinear ‘matched filter’ to successfully discriminate most targets from the background. Importantly, this model can explain this type of feature discrimination without the need for relative motion cues.
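The temporal stages of such a pipeline can be sketched per photoreceptor as a time-domain filter. The code below is a minimal illustration, not the published model: the filter time constants (`tau_hp`, `tau_on`, `tau_off`) and the divisive form of the adaptation are hypothetical stand-ins for the derived RTC parameters, and the spatial (optics) stage is omitted.

```python
import numpy as np

def lowpass(x, tau, dt=1.0):
    """Discrete first-order low-pass filter."""
    a = dt / (tau + dt)
    y = np.zeros_like(x, dtype=float)
    for i in range(1, len(x)):
        y[i] = y[i - 1] + a * (x[i] - y[i - 1])
    return y

def rtc_stage(luminance, tau_hp=20.0, tau_on=5.0, tau_off=40.0, dt=1.0):
    """Transient extraction (LMC-like band-pass), half-wave rectification
    into separate ON and OFF channels, each divisively adapted by its own
    recent activity, then recombined into a full-wave rectified output."""
    transient = luminance - lowpass(luminance, tau_hp, dt)
    on = np.maximum(transient, 0.0)    # responds to luminance increases
    off = np.maximum(-transient, 0.0)  # responds to luminance decreases
    # independent adaptation: each channel is normalized only by its own
    # history, so ON and OFF adapt to opposite-signed changes separately
    on_adapt = on / (1.0 + lowpass(on, tau_on, dt))
    off_adapt = off / (1.0 + lowpass(off, tau_off, dt))
    return on_adapt + off_adapt
```

Feeding a luminance pulse through `rtc_stage` produces a transient at both the bright and the dark edge, with each edge adapted by its own channel, as expected for separate ON and OFF pathways.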
When a human catches a ball, they estimate future target location based on the current trajectory. How animals, small and large, encode such predictive processes at the single neuron level is unknown. Here we describe small target-selective neurons in predatory dragonflies that exhibit localized enhanced sensitivity for targets displaced to new locations just ahead of the prior path, with suppression elsewhere in the surround. This focused region of gain modulation is driven by predictive mechanisms, with the direction tuning shifting selectively to match the target’s prior path. It involves a large local increase in contrast gain which spreads forward after a delay (e.g. during an occlusion) and can even transfer between brain hemispheres, predicting trajectories moved towards the visual midline from the other eye. The tractable nature of dragonflies for physiological experiments makes this a useful model for studying the neuronal mechanisms underlying the brain’s remarkable ability to anticipate moving stimuli.

DOI: http://dx.doi.org/10.7554/eLife.26478.001
Animals need attention to focus on one target amid alternative distracters. Dragonflies, for example, capture flies in swarms comprising both prey and conspecifics, a feat that requires neurons to select one moving target from competing alternatives. Diverse evidence, from functional imaging and physiology to psychophysics, highlights the importance of such "competitive selection" in attention for vertebrates. Analogous mechanisms have been proposed in artificial intelligence and even in invertebrates, yet direct neural correlates of attention are scarce in all animal groups. Here, we demonstrate responses from an identified dragonfly visual neuron that match a model for competitive selection to within the limits of neuronal variability (r² = 0.83). Responses to individual targets moving at different locations within the receptive field differ in both magnitude and time course. However, responses to two simultaneous targets exclusively track those for one target alone, rather than any combination of the pair. Irrespective of target size, contrast, or separation, this neuron selects one target from the pair and perfectly preserves its response, regardless of whether the "winner" would be the stronger stimulus if presented alone. This neuron is amenable to electrophysiological recordings, providing neuroscientists with a new model system for studying selective attention.
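The selection property described above can be illustrated with a toy winner-take-all sketch. This is a hypothetical minimal form, assuming selection is made once per presentation from a scalar salience drive plus a free bias term (since the recorded neuron's winner is not always the stronger stimulus); the key property is that the paired response equals one single-target response exactly, rather than any sum, average, or pointwise maximum of the pair.

```python
import numpy as np

def competitive_selection(resp_a, resp_b, bias=0.0):
    """Winner-take-all sketch: given the response each target evokes
    alone, return the response to the simultaneous pair. One target is
    selected (here from early-response salience plus a hypothetical
    bias term) and its response is preserved in full; the losing
    target contributes nothing to the output."""
    drive_a = resp_a[:10].mean() + bias
    drive_b = resp_b[:10].mean()
    return resp_a.copy() if drive_a >= drive_b else resp_b.copy()
```

Because the output is a verbatim copy of one input, it tracks that target's full time course, matching the observation that the paired response follows one target alone in both magnitude and time course.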
Dragonflies detect and pursue targets such as other insects for feeding and conspecific interaction. They have a class of neurons highly specialized for this task in their lobula, the “small target motion detector” (STMD) neurons. One such neuron, CSTMD1, reaches its maximum response slowly, over hundreds of milliseconds of target motion. Recording intracellularly from CSTMD1 and a second neuron in this system, BSTMD1, we determined that for these neurons to reach maximum response levels, target motion must produce sequential local activation of elementary motion-detecting elements. This facilitation effect is most pronounced when targets move at velocities slower than what was previously thought to be optimal. It is completely disrupted if targets are instantaneously displaced a few degrees from their current location. Additionally, we use a simple computational model to discount the parsimonious hypothesis that CSTMD1's slow build-up to maximum response is due to a sluggish neural delay filter. Whilst the observed facilitation may be too slow to play a role in prey pursuit flights, which are typically resolved rapidly, we hypothesize that it helps maintain elevated sensitivity during prolonged, aerobatically intricate conspecific pursuits. Since the effect appears to be spatially localized, it most likely enhances the relative salience of the most recently “seen” locations during such pursuit flights.
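The modelling argument can be sketched as follows. Under a purely temporal "sluggish delay filter", the slow build-up depends only on how long the target has been moving, so an instantaneous displacement changes nothing; a spatially localized facilitation field, by contrast, resets when the target jumps away from the facilitated region. All constants below (time constant, field width, receptive-field size) are hypothetical, chosen only to illustrate the dissociation, and the elementary motion detectors are abstracted away.

```python
import numpy as np

def simulate(path, model, tau=100.0, sigma=2.0, dt=1.0):
    """Response build-up for a target tracing `path` (positions over time).
    model='filter': a low-passed constant motion drive (the sluggish
        delay-filter hypothesis) -- blind to where the target is.
    model='facilitation': gain builds up locally under the target
        (Gaussian facilitation field) and the response reads out the
        gain at the target's current position."""
    a = dt / (tau + dt)
    x = np.arange(40)            # 1-D receptive field positions
    gain = np.zeros(40)          # facilitation field over position
    state = 0.0                  # low-pass state for the filter model
    out = []
    for p in path:
        if model == "filter":
            state += a * (1.0 - state)   # slow, position-blind build-up
            out.append(state)
        else:
            local = np.exp(-0.5 * ((x - p) / sigma) ** 2)
            gain += a * (local - gain)   # gain grows under the target
            out.append(gain[int(round(p))])
    return np.array(out)
```

A continuous path and one with a mid-trial jump give identical filter-model outputs, but the facilitation-model response collapses immediately after the jump, reproducing the disruption by small instantaneous displacements.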
The relation between lecture attendance and learning is surprisingly weak, and the role of learning styles in this is poorly understood. We hypothesized that 1) academic performance is related to lecture attendance and 2) learning style influences lecture attendance and, consequently, affects performance. We also speculated that the availability of alternative resources would affect this relationship. Second-year Bachelor of Science physiology students (n = 120) self-reported their lecture attendance in a block of 21 lectures (attendance not compulsory) and use of alternative resources. Overall self-reported lecture attendance was 73 ± 2%. Female students (n = 71) attended more lectures (16.4 ± 0.6) than male students (14.3 ± 0.08, n = 49) and achieved a higher composite mark in all assessments (73.6% vs. 69.3%, P < 0.02). Marks in the final exam were not statistically different between the sexes and correlated only weakly with lecture attendance (r = 0.29, n = 49, P < 0.04 for male students; r = 0.10, n = 71, P = not significant for female students; and r = 0.21, n = 120, P < 0.02 for the whole class). Of the students who passed the exam, poor attenders (<11 lectures) reported significantly more use of lecture recordings (37 ± 8%, n = 15, vs. 10 ± 1%, n = 85, P < 0.001). In a VARK learning style assessment (where V is visual, A is auditory, R is reading/writing, and K is kinesthetic), students were multimodal, although female students had a slightly higher average percentage of the R learning style (preferred read/write) compared with male students (28.9 ± 0.9%, n = 63, vs. 25.3 ± 1.3%, n = 32, P < 0.03). Lecture attendance was not correlated with measured learning style. We concluded that lecture attendance is only weakly correlated with academic performance and is not related to learning style. The substitution of alternative materials for lecture attendance appears to have a greater role than learning style in determining academic outcomes.