Focal-plane wavefront sensing (FPWFS) is appealing for several reasons. Notably, it offers high sensitivity and does not suffer from non-common path aberrations (NCPA). The price to pay is a high computational burden and the need for diversity to lift phase ambiguities. If those limitations can be overcome, FPWFS is an excellent solution for NCPA measurement, a key limitation for high-contrast imaging, and could also serve as an adaptive optics wavefront sensor. Here, we propose to use deep convolutional neural networks (CNNs) to measure NCPA from focal-plane images. Two CNN architectures are considered: ResNet-50, used to estimate Zernike coefficients, and U-Net, used to estimate the phase map directly. The models are trained on labelled datasets and evaluated at various flux levels and for two spatial frequency contents (20 and 100 Zernike modes). In these idealized simulations, we demonstrate that the CNN-based models reach the photon-noise limit over a large range of conditions. We show, for example, that the root mean squared (rms) wavefront error (WFE) can be reduced to <λ/1500 for 2 × 10^6 photons in one iteration when estimating 20 Zernike modes. We also show that CNN-based models are robust to varying signal-to-noise ratios, to the presence of higher-order aberrations, and to different aberration amplitudes. In addition, they display similar or superior performance compared to iterative phase retrieval algorithms. CNNs therefore represent a compelling way to implement FPWFS and to leverage its high sensitivity over a broad range of conditions.
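The forward model underlying this kind of FPWFS, and the phase ambiguity that makes diversity necessary, can be illustrated with a short numpy sketch. This is not the authors' code: the grid size, Zernike normalization, and noiseless monochromatic propagation are simplifying assumptions chosen for illustration.

```python
import numpy as np

def zernike_defocus(n):
    """Noll Z4 (defocus), an even mode, on an n x n unit-disk grid."""
    y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
    return np.sqrt(3) * (2 * (x**2 + y**2) - 1)

def psf_from_phase(phase_rad):
    """Noiseless focal-plane PSF from a pupil-plane phase map (radians),
    via Fraunhofer propagation (a single FFT of the pupil field)."""
    n = phase_rad.shape[0]
    y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
    pupil = (x**2 + y**2 <= 1.0).astype(float)
    field = pupil * np.exp(1j * phase_rad)
    psf = np.abs(np.fft.fftshift(np.fft.fft2(field))) ** 2
    return psf / psf.sum()  # normalize to unit total flux

# An even aberration and its negative produce identical in-focus PSFs,
# so a single image cannot recover the sign without extra diversity.
```

A CNN trained on (PSF, Zernike coefficients) pairs generated by such a simulator learns the inverse mapping; photon noise and realistic sampling, omitted here, are what set the λ/1500 floor quoted above.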
Context. The performance of high-contrast imaging instruments is limited by wavefront errors, in particular by non-common path aberrations (NCPAs). Focal-plane wavefront sensing (FPWFS) is well suited to handling NCPAs because it measures the aberrations where they matter most, namely at the science focal plane. Phase retrieval from focal-plane images nonetheless leaves a sign ambiguity for the even modes of the pupil-plane phase.
Aims. The phase diversity methods currently used to lift the sign ambiguity tend to reduce the science duty cycle, that is, the fraction of observing time dedicated to science. In this work, we explore how the phase diversity provided by a vortex coronagraph can be combined with modern deep learning techniques to perform efficient FPWFS without losing observing time.
Methods. We applied the state-of-the-art convolutional neural network EfficientNet-B4 to infer phase aberrations from simulated focal-plane images. The two cases of the scalar and vector vortex coronagraphs (SVC and VVC) were considered, using a single post-coronagraphic point spread function (PSF) or two PSFs obtained by splitting the circular polarization states, respectively.
Results. The sign ambiguity was properly lifted in both cases, even at low signal-to-noise ratios (S/Ns). With either the SVC or the VVC, we reached a performance very similar to that obtained with classical phase diversity using a defocused PSF, except at high levels of aberrations, where the SVC slightly underperforms the other approaches. The models also show great robustness when trained on data with a wide range of wavefront errors and noise levels.
Conclusions. The proposed FPWFS technique provides a 100% science duty cycle for instruments equipped with a vortex coronagraph and requires no additional hardware in the case of the SVC.
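The vortex coronagraph that provides the phase diversity here can be sketched as a three-plane propagation: pupil, focal-plane vortex mask exp(i·charge·θ), and a relayed pupil with a Lyot stop. The sketch below is a deliberately crude numpy stand-in (arbitrary grid size, padding factor, and Lyot ratio are assumptions), not the simulation pipeline used in the paper:

```python
import numpy as np

def vortex_psf(phase_rad, charge=2, pad=4, lyot_ratio=0.95):
    """Sketch of a post-coronagraphic PSF: pupil -> focal-plane vortex mask
    exp(1j*charge*theta) -> relayed pupil with Lyot stop -> detector.
    charge=0 disables the mask, giving classical (non-coronagraphic) imaging."""
    n = phase_rad.shape[0]
    N = pad * n                       # zero-padding for focal-plane sampling
    y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
    pupil = (x**2 + y**2 <= 1.0).astype(float)
    s = (N - n) // 2
    field = np.zeros((N, N), dtype=complex)
    field[s:s + n, s:s + n] = pupil * np.exp(1j * phase_rad)

    # to the focal plane, apply the vortex phase ramp
    focal = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(field)))
    ky, kx = np.mgrid[-(N // 2):N - N // 2, -(N // 2):N - N // 2]
    focal *= np.exp(1j * charge * np.arctan2(ky, kx))

    # back to a relayed pupil, apply the Lyot stop, then to the detector
    relayed = np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(focal)))
    lyot = np.zeros((N, N))
    lyot[s:s + n, s:s + n] = (x**2 + y**2 <= lyot_ratio**2).astype(float)
    det = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(relayed * lyot)))
    return np.abs(det) ** 2
```

For a flat wavefront, the even-charge vortex diffracts the on-axis starlight outside the relayed pupil, where the Lyot stop rejects it; aberrations break this rejection in a sign-dependent way, which is the diversity the network exploits.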
High-contrast imaging instruments are today primarily limited by non-common path aberrations arising between the wavefront sensor of the adaptive optics system and the science camera. Early attempts at using artificial neural networks for focal-plane wavefront sensing showed promising results, but today's higher computational power and deep architectures promise increased performance, flexibility, and robustness that have yet to be exploited. We implement two convolutional neural networks (CNNs) to estimate wavefront errors from simulated point spread functions, in both the low- and high-aberration regimes. We then extend our CNN model with a mixture density network (MDN) and show that it can capture the sign ambiguity of the phase by predicting each Zernike coefficient as a probability distribution. We also apply our method to the vector vortex coronagraph (VVC), comparing its phase retrieval performance with that of classical imaging. Finally, preliminary results indicate that the VVC combined with polarized light can lift the sign ambiguity.
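A mixture density network replaces the usual regression head with per-coefficient Gaussian mixture parameters (weights, means, widths) trained by negative log-likelihood. As a hedged sketch (the array shapes and the choice of Gaussian components are assumptions, not the paper's exact implementation), the loss can be written in numpy as:

```python
import numpy as np

def mdn_nll(logits, means, log_sigmas, y):
    """Negative log-likelihood of target coefficients y under per-mode
    Gaussian mixtures parameterized by network outputs.
    logits, means, log_sigmas: (n_modes, n_components); y: (n_modes,)."""
    # mixture weights via log-softmax over components
    log_pi = logits - np.log(np.sum(np.exp(logits), axis=1, keepdims=True))
    # per-component Gaussian log-density
    z = (y[:, None] - means) / np.exp(log_sigmas)
    log_comp = -0.5 * np.log(2 * np.pi) - log_sigmas - 0.5 * z**2
    # log-sum-exp over components, summed over Zernike modes
    joint = log_pi + log_comp
    m = joint.max(axis=1, keepdims=True)
    log_like = m[:, 0] + np.log(np.sum(np.exp(joint - m), axis=1))
    return -np.sum(log_like)
```

A bimodal prediction with components at +c and −c assigns equal likelihood to either sign of an even mode, which is how the MDN expresses the sign ambiguity instead of collapsing to one (possibly wrong) point estimate.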
Instrumental aberrations strongly limit high-contrast imaging of exoplanets, especially when they produce quasi-static speckles in the science images. With the help of recent advances in deep learning, we developed in previous works an approach that applies convolutional neural networks (CNNs) to estimate pupil-plane phase aberrations from point spread functions (PSFs). In this work we take a step further by incorporating into the deep learning architecture the physical simulation of the optical propagation occurring inside the instrument. This is achieved with an autoencoder architecture that uses a differentiable optical simulator as the decoder. Because this unsupervised learning approach reconstructs the PSFs themselves, the true phase is not needed to train the models, making the method particularly promising for on-sky applications. We show that its performance is almost identical to that of a standard CNN approach, and that training is stable and the models robust. We notably illustrate how the simulator-based autoencoder architecture can be exploited by quickly fine-tuning a model on a single test image, achieving much better performance when the PSFs contain more noise and aberrations. These early results are very promising, and future steps have been identified to apply the method to real data.
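The key idea, an optical simulator serving as a fixed decoder so that the training signal is the PSF reconstruction error rather than the (unknown) true phase, can be sketched as follows. This is a plain numpy stand-in under assumed shapes and a toy modal basis; in the actual approach the decoder is written in an autodiff framework so gradients flow from this loss back into the CNN encoder:

```python
import numpy as np

def simulator_decoder(coeffs, basis):
    """Decoder = fixed optical simulator: modal coefficients -> phase -> PSF.
    basis: (n_modes, n, n) stack of phase maps (an assumed modal basis)."""
    phase = np.tensordot(coeffs, basis, axes=1)   # (n, n) phase map in radians
    n = phase.shape[0]
    y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
    pupil = (x**2 + y**2 <= 1.0).astype(float)
    field = pupil * np.exp(1j * phase)
    psf = np.abs(np.fft.fft2(field)) ** 2
    return psf / psf.sum()

def reconstruction_loss(coeffs, basis, observed_psf):
    """Unsupervised objective: only the observed PSF is needed, no phase label."""
    return np.mean((simulator_decoder(coeffs, basis) - observed_psf) ** 2)
```

Because the loss depends only on the observed image, the same objective can be minimized at test time on a single PSF, which is the fine-tuning trick described above.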