Scattering often limits the controlled delivery of light in applications such as biomedical imaging, optogenetics, optical trapping, and fiber-optic communication or imaging. Such scattering can be compensated by appropriately shaping the wavefront of the light entering the material. Here, we develop a machine-learning approach to light control. Using pairs of binary intensity patterns and corresponding intensity measurements, we train neural networks (NNs) to provide the wavefront corrections necessary to shape the beam after the scatterer. Additionally, we demonstrate that NNs can be used to find a functional relationship between transmitted and reflected speckle patterns. Establishing the validity of this relationship, we focus and scan light in transmission through opaque media using only reflected light. Our approach shows the versatility of NNs for light shaping, for efficiently and flexibly correcting scattering, and in particular the feasibility of transmission control based on reflected light. * Supplementary videos available at https://www.osapublishing.org/oe/abstract.cfm?uri=oe-26-23-30911 † These two authors contributed equally
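The idea of learning an inverse mapping from intensity measurements back to input patterns can be illustrated with a toy simulation. The sketch below is an assumed minimal model for illustration, not the paper's experimental system or network architecture: a random complex transmission matrix stands in for the scatterer, pairs of binary input patterns and output intensities form the training data, and a single linear layer is fitted from intensities back to patterns by gradient descent.

```python
import numpy as np

rng = np.random.default_rng(42)
n_in, n_out, n_samples = 16, 64, 2000

# Random complex transmission matrix standing in for the scatterer
# (hypothetical toy model, not the experimental medium).
T = (rng.normal(size=(n_out, n_in))
     + 1j * rng.normal(size=(n_out, n_in))) / np.sqrt(n_in)

# Dataset: binary input patterns and the resulting output intensities.
patterns = rng.integers(0, 2, size=(n_samples, n_in)).astype(float)
intensities = np.abs(patterns @ T.T) ** 2

# Simplest possible "network": one linear layer, trained by gradient
# descent on the mean-squared error of the reconstructed pattern.
W = np.zeros((n_out, n_in))
b = np.zeros(n_in)
lr = 0.01

def mse(X, Y, W, b):
    return np.mean((X @ W + b - Y) ** 2)

loss_before = mse(intensities, patterns, W, b)
for _ in range(200):
    pred = intensities @ W + b
    err = pred - patterns
    W -= lr * intensities.T @ err / n_samples
    b -= lr * err.mean(axis=0)
loss_after = mse(intensities, patterns, W, b)
print(loss_before, loss_after)  # training reduces the reconstruction error
```

A linear layer only captures part of the nonlinear intensity-to-pattern relationship; the deep networks described above are needed for the full inverse model.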
Light scattering and aberrations limit optical microscopy in biological tissue, motivating the development of adaptive optics techniques. Here, we develop a method for wavefront correction in adaptive optics with reflected light and deep neural networks, compatible with an epi-detection configuration. Large datasets of sample aberrations, consisting of excitation- and detection-path aberrations together with the corresponding reflected focus images, are generated and used to train deep neural networks. After training, these networks can disentangle and independently correct excitation and detection aberrations based on reflected-light images recorded from scattering samples. A similar deep-learning approach is also demonstrated with scattering guide stars. The predicted aberration corrections are validated using two-photon imaging.
Background: Virtual reality combined with spherical treadmills is used across species for studying neural circuits underlying navigation.
New Method: We developed an optical-flow-based method for tracking treadmill ball motion in real time using a single high-resolution camera.
Results: Tracking accuracy and timing were determined using calibration data. Ball tracking was performed at 500 Hz and integrated with an open-source game engine for virtual reality projection. The projection was updated at 120 Hz with a latency with respect to ball motion of 30 ± 8 ms.
Comparison with Existing Method(s): Optical-flow-based tracking of treadmill motion is typically achieved using optical mice. The camera-based optical-flow tracking system developed here is based on off-the-shelf components and offers control over the image acquisition and processing parameters. This results in flexibility with respect to tracking conditions, such as ball surface texture, lighting conditions, or ball size, as well as camera alignment and calibration.
Conclusions: A fast system for rotational ball motion tracking, suitable for virtual reality animal behavior across different scales, was developed and characterized.

Virtual reality (VR) is used across species for studying neural circuits underlying behavior [1]. In many implementations, animals navigate through virtual realities on a spherical treadmill, a ball which can be freely rotated around its center of mass [1, 2, 3, 4, 5].

Tracking of ball rotation is typically accomplished using optical mice, which are based on low-resolution, high-speed cameras integrated with a light source for measuring displacements when moving across a surface. Movement across the surface results in optical flow, the displacement of features across the camera sensor. Such features can, for example, be speckle-like reflections from surface roughness; comparing these speckle images between different frames then allows computing the displacement using hardware-integrated image processing.
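The principle of recovering a displacement by comparing two speckle-like images can be sketched with phase correlation, a standard FFT-based technique (shown here as an illustrative example, not the implementation used in the tracking system described above):

```python
import numpy as np

def estimate_shift(frame_a, frame_b):
    """Estimate the integer-pixel translation of frame_a relative to frame_b
    via phase correlation: the inverse FFT of the normalized cross-power
    spectrum peaks at the displacement between the two frames."""
    F_a = np.fft.fft2(frame_a)
    F_b = np.fft.fft2(frame_b)
    cross_power = F_a * np.conj(F_b)
    cross_power /= np.abs(cross_power) + 1e-12  # keep only the phase
    corr = np.fft.ifft2(cross_power).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap peak coordinates into signed shifts (circular FFT indexing).
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

# Synthetic speckle image, displaced by (3, -5) pixels.
rng = np.random.default_rng(0)
img = rng.random((64, 64))
shifted = np.roll(img, shift=(3, -5), axis=(0, 1))
print(estimate_shift(shifted, img))  # → (3, -5)
```

In a real tracking setting, such per-frame displacements from the ball surface would then be integrated into rotation estimates; optical mouse sensors perform an analogous correlation in hardware.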
Optical mice come, however, with some limitations for measuring ball rotation. First, a single optical mouse measures displacements only in two directions, and therefore two mice are required for tracking all three degrees of freedom of ball motion. Secondly, limited or no control over the onboard processing algorithms as well as camera settings requires careful calibration. In particular, if the mouse sensors cannot be placed in direct proximity of the ball surface, accurate alignment of the two sensors as well as calibration with respect to surface properties and lighting conditions is necessary [5].

As an approach that overcomes some of these limitations, real-time tracking was developed with a single high-resolution camera for situations where a uniquely patterned ball can be used [6]. In that case, ball orientation was calculated by matching each recorded frame to a map of the entire ball surface pattern. Using a high-resolution camera allows control over all recording parameters and offers the freedom to choose a custom algorithm and test its per...
Aberrations limit optical systems in many situations, for example when imaging in biological tissue. Machine learning offers novel ways to improve imaging under such conditions by learning inverse models of aberrations. Learning, however, requires datasets that cover a wide range of possible aberrations, which becomes limiting for more strongly scattering samples, and it does not take advantage of prior information about the imaging process. Here, we show that combining model-based adaptive optics with the optimization techniques of machine learning frameworks can find aberration corrections with a small number of measurements. Corrections are determined in a transmission configuration through a single aberrating layer, and in a reflection configuration through two different layers at the same time. Additionally, corrections are not limited to a predetermined model of aberrations (such as combinations of Zernike modes). Focusing in transmission can be achieved based only on reflected light, compatible with an epi-detection imaging configuration.
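The core idea of model-based correction, optimizing correction parameters through a differentiable forward model, can be sketched with a minimal numpy simulation. The model below is an assumption for illustration (a single-layer phase aberration over independent modes and an idealized on-axis focus intensity), not the experimental configuration of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n_modes = 64

# Unknown phase aberration over the pupil modes (toy stand-in for the layer).
phi_ab = rng.uniform(0, 2 * np.pi, n_modes)
phi_c = np.zeros(n_modes)  # correction phases to be optimized

def focus_intensity(phi_c):
    """Forward model: on-axis focus intensity |sum_n e^{i(phi_ab_n + phi_c_n)}|^2."""
    return np.abs(np.exp(1j * (phi_ab + phi_c)).sum()) ** 2

I_initial = focus_intensity(phi_c)
lr = 0.005
for _ in range(2000):
    field = np.exp(1j * (phi_ab + phi_c))
    s = field.sum()
    # Analytic gradient of |s|^2 with respect to each correction phase.
    grad = 2 * np.real(np.conj(s) * 1j * field)
    phi_c += lr * grad  # gradient ascent on the focus intensity

I_final = focus_intensity(phi_c)
print(I_initial, I_final, n_modes**2)  # I_final approaches the ideal n_modes^2
```

In a machine-learning framework such an optimization would use automatic differentiation instead of the hand-derived gradient, which is what makes the approach independent of a predetermined aberration basis.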