Understanding the nanoscale chemical speciation of heterogeneous systems in their native environment is critical for several disciplines, such as the life and environmental sciences, biogeochemistry, and materials science. Synchrotron-based x-ray spectromicroscopy tools are widely used to understand the chemistry and morphology of complex material systems owing to their high penetration depth and sensitivity. The multi-dimensional (4D+) structure of spectromicroscopy data poses visualization and data reduction challenges. This paper reports strategies for the visualization and analysis of spectromicroscopy data. We created a new graphical user interface and data analysis platform named XMIDAS to visualize spectromicroscopy data in both image and spectrum representations. The interactive data analysis toolkit combines conventional analysis methods with well-established machine learning algorithms (e.g., non-negative matrix factorization) for data reduction. The data visualization and analysis methodologies were then defined and optimized using a model particle aggregate of known chemical composition. Nanoprobe-based x-ray fluorescence (nano-XRF) and x-ray absorption near edge structure (nano-XANES) spectromicroscopy techniques were used to probe the elemental and chemical state information of the aggregate sample. We illustrated the complete chemical speciation methodology for the model particle using XMIDAS. Next, we demonstrated the application of this approach to detecting and characterizing nanoparticles associated with alveolar macrophages. Our multimodal approach, combining nano-XRF, nano-XANES, and differential phase-contrast (DPC) imaging, efficiently visualizes the chemistry of localized nanostructures together with their morphology. We believe that the optimized data reduction strategies and tool development will facilitate the analysis of complex biological and environmental samples using x-ray spectromicroscopy techniques.
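To make the data-reduction step concrete, the following is a minimal, hypothetical sketch of how a nano-XANES image stack could be decomposed with non-negative matrix factorization using scikit-learn; the array shapes, variable names, and number of components are illustrative assumptions rather than XMIDAS internals.

    # Hypothetical sketch: NMF decomposition of a XANES image stack.
    # Shapes, names, and the number of components are assumptions.
    import numpy as np
    from sklearn.decomposition import NMF

    n_energies, ny, nx = 80, 64, 64              # assumed stack dimensions
    rng = np.random.default_rng(0)
    stack = rng.random((n_energies, ny, nx))     # stand-in for measured data

    # Reshape to (n_pixels, n_energies): each row is one pixel spectrum
    X = stack.reshape(n_energies, -1).T

    # Factorize X ~ W @ H: H holds component spectra, W per-pixel weights
    k = 3
    model = NMF(n_components=k, init="nndsvda", max_iter=500)
    W = model.fit_transform(X)                   # (n_pixels, k) abundances
    H = model.components_                        # (k, n_energies) spectra

    # Fold the weights back into k chemical maps for visualization
    maps = W.T.reshape(k, ny, nx)

In practice, component spectra obtained this way would be compared against reference XANES standards to assign chemical states, while the corresponding maps show where each component is localized.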
The NSLS-II network and computing infrastructure has recently been significantly upgraded. The re-IP process in 2020-2021 made the NSLS-II network routable to the rest of the BNL campus. Standardization of the operating systems and deployment procedures then helped deliver a consistent environment to the workstations and servers used by all NSLS-II beamlines. In particular, Red Hat Enterprise Linux 8 was deployed to 700+ machines using the Red Hat Satellite infrastructure management product, and all critical services (IOCs, databases, etc.) were migrated to the new OS. Users' NFS home directories are consistent across all of the machines, which eliminates the need to configure the user environment individually on each host. A standard suite of software packages is available to beamline staff and users, including system packages (deployed via RPM) as well as conda environments for data acquisition and analysis. Security measures were implemented to comply with industry standards, including multi-factor authentication (using Duo), secure screen lock for the beamline machines, and advanced access control for the experimental data stored in shared central storage available on all hosts. These major enhancements facilitated sharing experimental data with users via an externally facing JupyterHub instance, https://jupyter.nsls2.bnl.gov (currently for a number of selected beamlines, with plans to extend it to the whole facility in the near future). The beamlines continue to use the Bluesky data acquisition framework to orchestrate their experiments, and the new infrastructure enabled them to adopt a next-generation data access library called tiled.
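As a rough illustration of the resulting data-access workflow, the snippet below shows how a user in a JupyterHub session might read experimental data through the tiled client; the server URI, catalog keys, and scan identifier are assumptions for illustration, not the production layout.

    # Hypothetical sketch of accessing experimental data via the tiled client
    # from a Jupyter session. The URI, catalog keys, and scan id are assumed.
    from tiled.client import from_uri

    client = from_uri("https://tiled.nsls2.bnl.gov")   # illustrative endpoint

    # Navigate the catalog like nested dictionaries: beamline -> catalog -> scan
    run = client["some_beamline"]["raw"]["scan_12345"]

    # Read one data stream into memory for analysis
    data = run["primary"]["data"].read()
    print(data)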
Recent developments in 4th-generation light sources and high-speed detectors are leading to rapid growth in data rates and data volumes, increasing the demand for automated data collection, handling/reduction/storage, and analysis processes. Combined with limited in-person access to experimental setups during the pandemic, portable and user-friendly tools for remote access, as well as improved workflows, are critical for enabling scientists from various disciplines to leverage ptychographic imaging to answer scientific questions. With the growing popularity of ptychography, a broad range of data formats, acquisition schemes, and algorithms has been developed over the years, e.g., [1][2][3]. While this variety has been advantageous for tackling different real-world deviations from the ideal ptychographic model, such as partial incoherence [4], positioning errors [5], broad-bandwidth radiation [6], or multiple scattering [7], it also complicates the comparability and reproducibility of results. With ptychography established as an everyday workhorse technique at many instruments around the world, it is important to find common ground and establish standards that support reliable algorithm development and collaborative software development addressing the big-data challenges of today and the future. In this presentation, I will cover recent cross-facility efforts [8] to develop and promote data standards for ptychography. Furthermore, I will give an overview of ongoing software development at the Advanced Light Source, in collaboration with the other DOE light sources, for building data acquisition and analysis tools that leverage existing Python packages, with an outlook on future progress in remote access and workflows.
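To give a flavor of the kind of standardization under discussion, here is a minimal, hypothetical sketch of writing a ptychography scan to HDF5 with a CXI-inspired layout using h5py; the group and dataset names, units, and values are illustrative assumptions and do not represent the standard promoted by the cross-facility effort [8].

    # Hypothetical sketch: a CXI-inspired HDF5 layout for one ptychography scan.
    # All paths, units, and values are illustrative assumptions.
    import h5py
    import numpy as np

    n_pos, det_ny, det_nx = 100, 128, 128
    frames = np.random.poisson(5.0, size=(n_pos, det_ny, det_nx)).astype("uint32")
    positions = np.random.uniform(-5e-6, 5e-6, size=(n_pos, 3))  # x, y, z in meters

    with h5py.File("scan_0001.cxi", "w") as f:
        entry = f.create_group("entry_1")
        inst = entry.create_group("instrument_1")
        src = inst.create_group("source_1")
        src["energy"] = 8.8e3 * 1.602176634e-19      # photon energy in joules
        det = inst.create_group("detector_1")
        det.create_dataset("data", data=frames, compression="gzip")
        det["distance"] = 2.0                        # sample-detector distance (m)
        det["x_pixel_size"] = 75e-6                  # pixel size (m)
        det["y_pixel_size"] = 75e-6
        data_grp = entry.create_group("data_1")
        data_grp["translation"] = positions          # scan positions (m)

A shared layout of this kind is what allows diffraction frames, scan positions, and geometry to be exchanged between reconstruction packages without per-facility conversion scripts.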