Astigmatic optical systems encode the depth location of spherical objects in the defocus blur of their images. This allows the simultaneous imaging of the 3D positions of a large number of such objects, which can act as tracer particles in the study of fluid flows. The challenge lies in decoding the depth information, as defocused particle images may overlap or have low peak intensities. Current methods are not able to simultaneously detect and locate overlapping and low-intensity particle images, and their computational cost increases with particle image density. We show how semi-synthetic images of defocused particles with closely spaced centers can be employed to train an end-to-end trainable particle image detector. This allows low-intensity and overlapping particle images to be detected in a single pass of an image through a neural network. We present a thorough evaluation of the method's uncertainty for particle-based fluid flow measurements. For non-overlapping particle images, we achieve a depth-prediction error similar to that of previous algorithms. For neighboring particle images, the location error increases with decreasing distance between particle image centers and peaks when the centers coincide. On actual measurement images, the location error increases by approximately a factor of two when particle images share the same center location. The trained model detects low-intensity particle images close to the visibility limit and covers 91.4% of the depth range of a human annotator. For the experimental arrangement employed, this extended the depth range over which particle images can be detected by 67% compared with a previously used thresholding detection method (Franchini et al. in Adv Water Resour 124:1–8, 2019).
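The semi-synthetic training samples described above, pairs of defocused particle images with closely spaced centers, could be generated along the following lines. This is a minimal sketch: the isotropic Gaussian blob model, image size, amplitude and width ranges, and additive-noise level are illustrative assumptions, not the paper's actual rendering pipeline.

```python
import numpy as np

def particle_image(shape, center, sigma, amplitude):
    """Synthetic defocused particle image modeled as an isotropic Gaussian
    blob; sigma stands in for the defocus-dependent blur width."""
    y, x = np.indices(shape)
    r2 = (x - center[0]) ** 2 + (y - center[1]) ** 2
    return amplitude * np.exp(-r2 / (2.0 * sigma ** 2))

def overlapping_pair(shape=(64, 64), distance=6.0, noise_std=0.02, rng=None):
    """Semi-synthetic training sample: two particle images whose centers lie
    `distance` pixels apart (randomly oriented), plus additive sensor noise.
    Returns the image and the ground-truth center coordinates as labels."""
    rng = np.random.default_rng(rng)
    c = np.array(shape[::-1], dtype=float) / 2.0   # image midpoint (x, y)
    angle = rng.uniform(0.0, 2.0 * np.pi)
    offset = 0.5 * distance * np.array([np.cos(angle), np.sin(angle)])
    img = (particle_image(shape, c - offset,
                          sigma=rng.uniform(2.0, 5.0),
                          amplitude=rng.uniform(0.3, 1.0))
           + particle_image(shape, c + offset,
                            sigma=rng.uniform(2.0, 5.0),
                            amplitude=rng.uniform(0.3, 1.0)))
    img += rng.normal(0.0, noise_std, shape)       # camera noise stand-in
    centers = np.stack([c - offset, c + offset])   # ground-truth labels
    return img, centers
```

Sweeping `distance` toward zero reproduces the hard case evaluated in the abstract, where particle image centers coincide.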
Flow and transport in porous media are driven by pore-scale processes. Particle tracking in transparent porous media allows for the observation of these processes at the time scale of milliseconds. We demonstrate an application of defocusing particle tracking using brightfield illumination and a CMOS camera sensor. The resulting images have relatively high noise levels. To address this challenge, we propose a new calibration for locating particles in the out-of-plane direction. The methodology relies on extracting features of particle images by fitting generalized Gaussian distributions to them. The resulting fitting parameters are then linked to the out-of-plane coordinates of particles using flexible machine learning tools. A workflow is presented which shows how to generate a training dataset of fitting parameters paired with known out-of-plane locations. Several regression models are tested on the resulting training dataset, of which a boosted regression tree ensemble produced the lowest cross-validation error. The efficacy of the proposed methodology is then examined in a laminar channel flow in a large measurement volume of 2048, 1152 and 3000 μm in length, width and depth, respectively. The size of the test domain reflects the representative elementary volume of many fluid flow phenomena in porous media. Such large measurement depths require the collection of images at different focal levels; we acquired images at 21 focal levels spaced 150 μm apart. The error in predicting the out-of-plane location in a single slice of 240 μm thickness was found to be 7 μm, while in-plane locations were determined with sub-pixel resolution (below 0.8 μm). The mean relative error in the velocity measurement was obtained by comparing the experimental results to an analytic model of the flow. The estimated displacement errors in the axial direction of the flow were 0.21 pixel and 0.22 pixel at flow rates of 1.0 mL/h and 2.5 mL/h, respectively.
These results demonstrate that it is possible to conduct three-dimensional particle tracking in a representative elementary volume based on a simple apparatus comprising a microscope with standard brightfield illumination and a camera with CMOS sensor.
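The calibration pipeline described in this abstract, fit a generalized Gaussian to each particle image, then map the fitting parameters to the out-of-plane coordinate with a boosted regression tree ensemble, could be sketched as follows. Everything specific here is an assumption for illustration: the radially symmetric profile form, the linear defocus-width model `alpha(z)`, the slice span, and the noise level are hypothetical, and the actual feature set and calibration stack come from the paper's workflow, not this code.

```python
import numpy as np
from scipy.optimize import curve_fit
from sklearn.ensemble import GradientBoostingRegressor

def gen_gaussian(r, amp, alpha, beta):
    """Radially symmetric generalized Gaussian intensity profile."""
    return amp * np.exp(-(r / alpha) ** beta)

def fit_particle(img, center):
    """Extract features (amp, alpha, beta) by least-squares fit of the
    radial intensity profile around a known in-plane center."""
    y, x = np.indices(img.shape)
    r = np.hypot(x - center[0], y - center[1]).ravel()
    popt, _ = curve_fit(gen_gaussian, r, img.ravel(),
                        p0=(img.max(), 3.0, 2.0),
                        bounds=([0.0, 0.1, 0.1], [np.inf, np.inf, 10.0]))
    return popt

# --- hypothetical calibration stack: blur width grows with depth z ---
rng = np.random.default_rng(0)
shape, center = (32, 32), (15.5, 15.5)
zs = np.linspace(0.0, 240.0, 80)          # known out-of-plane positions (um)
features = []
for z in zs:
    alpha_true = 2.0 + 0.02 * z           # assumed defocus model (illustrative)
    y, x = np.indices(shape)
    r = np.hypot(x - center[0], y - center[1])
    img = gen_gaussian(r, 1.0, alpha_true, 2.0) + rng.normal(0.0, 0.01, shape)
    features.append(fit_particle(img, center))
X = np.array(features)                    # one (amp, alpha, beta) row per image

# boosted regression tree ensemble: fitting parameters -> depth coordinate
model = GradientBoostingRegressor(n_estimators=200, max_depth=3, random_state=0)
model.fit(X, zs)
```

In practice the training pairs would come from images of particles at the known focal levels of the calibration scan rather than from a synthetic blur model, and model selection would use cross-validation error as the abstract describes.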