In the context of next-generation radio telescopes, like the Square Kilometre Array, the efficient processing of large-scale data sets is extremely important. Convex optimisation tasks under the compressive sensing framework have recently emerged and provide both enhanced image reconstruction quality and scalability to increasingly larger data sets. We focus herein mainly on scalability and propose two new convex optimisation algorithmic structures able to solve the convex optimisation tasks arising in radio-interferometric imaging. They rely on proximal splitting and forward-backward iterations and can be seen, by analogy with the CLEAN major-minor cycle, as running sophisticated CLEAN-like iterations in parallel in multiple data, prior, and image spaces. Both methods support any convex regularisation function, in particular the well-studied ℓ1 priors promoting image sparsity in an adequate domain. Tailored for big data, they employ parallel and distributed computations to achieve scalability in terms of memory and computational requirements. One of them also exploits randomisation, over data blocks at each iteration, offering further flexibility. We present simulation results showing the feasibility of the proposed methods as well as their advantages compared to state-of-the-art algorithmic solvers. Our MATLAB code is available online on GitHub.
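The forward-backward iteration underlying these methods alternates a gradient step on the data-fidelity term with a proximity step on the regulariser; for an ℓ1 prior the proximity operator is soft-thresholding. The following is a minimal single-node sketch of this idea (not the paper's distributed algorithm), assuming a generic dense measurement matrix `Phi` standing in for the radio-interferometric measurement operator:

```python
import numpy as np

def soft_threshold(x, tau):
    """Proximity operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def forward_backward(y, Phi, gamma, lam, n_iter=200):
    """Minimise 0.5*||y - Phi x||^2 + lam*||x||_1 by forward-backward splitting.

    Phi is a hypothetical dense measurement matrix; gamma is the step size,
    which must satisfy gamma < 2 / ||Phi||^2 for convergence.
    """
    x = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        grad = Phi.T @ (Phi @ x - y)                        # forward (gradient) step
        x = soft_threshold(x - gamma * grad, gamma * lam)   # backward (prox) step
    return x
```

In the papers' distributed variants, the gradient step is split over data blocks and the proximity steps over multiple priors, all computed in parallel; the sketch above keeps only the core major/minor-cycle analogy.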
Next-generation radio interferometers, such as the Square Kilometre Array (SKA), will revolutionise our understanding of the universe through their unprecedented sensitivity and resolution. However, to realise these goals, significant challenges in image and data processing need to be overcome. The standard methods in radio interferometry for reconstructing images, such as CLEAN, have served the community well over the last few decades and have survived largely because they are pragmatic. However, they produce reconstructed interferometric images that are limited in quality and scalability for big data. In this work we apply and evaluate alternative interferometric reconstruction methods that make use of state-of-the-art sparse image reconstruction algorithms motivated by compressive sensing, which have been implemented in the PURIFY software package. In particular, we implement and apply the proximal alternating direction method of multipliers (P-ADMM) algorithm presented in a recent article. First, we assess the impact of the interpolation kernel used to perform gridding and degridding on sparse image reconstruction. We find that the Kaiser-Bessel interpolation kernel performs as well as prolate spheroidal wave functions, while providing a computational saving and an analytic form. Second, we apply PURIFY to real interferometric observations from the Very Large Array (VLA) and the Australia Telescope Compact Array (ATCA) and find that images recovered by PURIFY are of higher quality than those recovered by CLEAN. Third, we discuss how PURIFY reconstructions exhibit additional advantages over those recovered by CLEAN. The latest version of PURIFY, with the developments presented in this work, is made publicly available.
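The computational appeal of the Kaiser-Bessel kernel is its closed analytic form, built from the zeroth-order modified Bessel function. A minimal sketch of such a kernel follows; the support `width` and shape parameter `beta` are illustrative values chosen here, not PURIFY's actual defaults:

```python
import numpy as np

def kaiser_bessel(u, width=6.0, beta=13.0):
    """Kaiser-Bessel interpolation kernel for gridding/degridding.

    u     : offset from the grid point, in grid cells
    width : kernel support (cells); the kernel is zero for |u| > width/2
    beta  : taper parameter (illustrative value, an assumption here)
    Normalised so that the kernel equals 1 at u = 0.
    """
    u = np.atleast_1d(np.asarray(u, dtype=float))
    s = 1.0 - (2.0 * u / width) ** 2     # negative outside the support
    out = np.zeros_like(u)
    inside = s > 0
    out[inside] = np.i0(beta * np.sqrt(s[inside])) / np.i0(beta)
    return out
```

Unlike prolate spheroidal wave functions, which must be tabulated from a numerical eigenproblem, this expression can be evaluated on the fly for any offset, which is the computational saving the abstract refers to.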
Next-generation radio interferometers, like the Square Kilometre Array, will acquire large amounts of data with the goal of improving the size and sensitivity of the reconstructed images by orders of magnitude. The efficient processing of large-scale data sets is of great importance. We propose an acceleration strategy for a recently proposed primal-dual distributed algorithm. A preconditioning approach can incorporate into the algorithmic structure both the sampling density of the measured visibilities and the noise statistics. Using the sampling density information greatly accelerates the convergence speed, especially for highly non-uniform sampling patterns, while relying on the correct noise statistics optimises the sensitivity of the reconstruction. In connection to CLEAN, our approach can be seen as including in the same algorithmic structure both natural and uniform weighting, thereby simultaneously optimising both the resolution and the sensitivity. The method relies on a new non-Euclidean proximity operator for the data fidelity term that generalises the projection onto the ℓ2 ball where the noise lives for naturally weighted data, to the projection onto a generalised ellipsoid incorporating sampling density information through uniform weighting. Importantly, this non-Euclidean modification is only an acceleration strategy to solve the convex imaging problem, with data fidelity dictated only by the noise statistics. We show through simulations with realistic sampling patterns the acceleration obtained using the preconditioning. We also investigate the algorithm's performance for the reconstruction of the 3C129 radio galaxy from real visibilities and compare with multi-scale CLEAN, showing better sensitivity and resolution. Our MATLAB code is available online on GitHub.
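To see why a non-Euclidean metric helps, note that the projection onto an ellipsoid has no closed form in the Euclidean metric, but becomes a simple radial scaling when computed in the metric induced by the same weights that define the ellipsoid. The following sketch illustrates this idea for diagonal weights `w` (standing in for the sampling-density preconditioner); it is a simplified illustration, not the paper's exact operator:

```python
import numpy as np

def proj_ellipsoid(z, y, w, eps):
    """Project z onto the ellipsoid {x : ||sqrt(w) * (x - y)||_2 <= eps},
    computed in the metric induced by the diagonal weights w, where the
    projection reduces to a radial scaling towards the centre y.

    z   : point to project (e.g. current visibility estimate)
    y   : ellipsoid centre (e.g. measured visibilities)
    w   : positive diagonal weights (sampling-density information)
    eps : data-fidelity bound dictated by the noise statistics
    """
    r = z - y
    nrm = np.sqrt(np.sum(w * r ** 2))     # norm of r in the w-metric
    if nrm <= eps:
        return z.copy()                   # already feasible
    return y + (eps / nrm) * r            # scale radially onto the boundary
```

With `w` uniformly 1 this reduces to the usual projection onto the ℓ2 ball used for naturally weighted data, matching the generalisation described in the abstract.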
We leverage the Sparsity Averaging Reweighted Analysis (SARA) approach for interferometric imaging, which is based on convex optimisation, for the super-resolution of Cyg A from observations at the frequencies 8.422 GHz and 6.678 GHz with the Karl G. Jansky Very Large Array (VLA). The associated average sparsity and positivity priors enable image reconstruction beyond the instrumental resolution. An adaptive preconditioned primal-dual algorithmic structure is developed for imaging in the presence of unknown noise levels and calibration errors. We demonstrate the superior performance of the algorithm with respect to conventional CLEAN-based methods, reflected in super-resolved images with high fidelity. The high-resolution features of the recovered images are validated by referring to maps of Cyg A at higher frequencies, more precisely 17.324 GHz and 14.252 GHz. We also confirm the recent discovery of a radio transient in Cyg A, revealed in the recovered images of the investigated data sets. Our MATLAB code is available online on GitHub.