Tomographic reconstruction is an ill-posed inverse problem that calls for regularization. One possibility is to require sparsity of the unknown in an orthonormal wavelet basis. This, in turn, can be achieved by variational regularization, where the penalty term is the sum of the absolute values of the wavelet coefficients. The primal-dual fixed point (PDFP) algorithm introduced by Peijun Chen, Jianguo Huang, and Xiaoqun Zhang (Fixed Point Theory and Applications, 2016) computes the minimizer of the variational regularization functional iteratively using a soft-thresholding operation. Choosing the soft-thresholding parameter µ > 0 is analogous to the notoriously difficult problem of picking the optimal regularization parameter in Tikhonov regularization. Here, a novel automatic method is introduced for choosing µ, based on a control algorithm that drives the sparsity of the reconstruction to an a priori known ratio of nonzero to zero wavelet coefficients in the unknown.
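The µ-selection idea described above can be sketched in a few lines. The following is a minimal illustration only, not the authors' implementation: `soft_threshold` is the standard proximal operator of the ℓ¹ penalty, and `tune_mu` is a hypothetical multiplicative feedback rule that drives the fraction of surviving nonzero coefficients toward a target sparsity ratio.

```python
import numpy as np

def soft_threshold(x, mu):
    # Proximal operator of mu * ||x||_1: shrink magnitudes by mu, zero out the rest
    return np.sign(x) * np.maximum(np.abs(x) - mu, 0.0)

def tune_mu(coeffs, target_ratio, mu=0.1, gain=0.5, n_steps=50):
    # Feedback control: increase mu when too many coefficients survive
    # thresholding, decrease it when too few survive. The multiplicative
    # update keeps mu strictly positive.
    for _ in range(n_steps):
        ratio = np.mean(soft_threshold(coeffs, mu) != 0)
        mu *= np.exp(gain * (ratio - target_ratio))
    return mu
```

In a full PDFP iteration the thresholding would be applied to the wavelet coefficients at every step; here the control loop is shown in isolation on a fixed coefficient vector.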
In this paper, we consider a prior-based dimension-reduction Kalman filter for undersampled dynamic X-ray tomography. With this method, the X-ray reconstructions are parameterized by a low-dimensional basis. The proposed method is thus (a) computationally very light and (b) extremely robust, as all the computations can be done explicitly. With real and simulated measurement data, we show that the method provides accurate reconstructions even with a very limited number of angular directions.
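The reduced-dimension filtering idea can be illustrated schematically. In this sketch (an assumption for illustration, not the paper's code), the reconstruction is written as f = B c for a fixed low-dimensional basis B, so the Kalman measurement update acts only on the small coefficient vector c; the observation matrix H would be the product of the X-ray projection matrix with B.

```python
import numpy as np

def kf_update(c, P, H, y, R):
    """Standard Kalman measurement update on low-dimensional coefficients c.

    Model: y = H c + e with e ~ N(0, R). All quantities are small, so the
    update is explicit and cheap, as noted in the abstract.
    """
    S = H @ P @ H.T + R                     # innovation covariance
    Kg = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    c_new = c + Kg @ (y - H @ c)            # posterior mean of coefficients
    P_new = (np.eye(len(c)) - Kg @ H) @ P   # posterior covariance
    return c_new, P_new
```

Because every matrix lives in the coefficient space, the cost per time step scales with the basis dimension rather than the image size.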
Classical tomographic imaging is soundly understood and widely employed in medicine, nondestructive testing, and security applications. However, it still offers many challenges when it comes to dynamic tomography. Indeed, in classical tomography, the target is usually assumed to be stationary during the data acquisition, but this is not a realistic model. Moreover, to ensure a lower X-ray radiation dose, only a sparse collection of measurements per time step is assumed to be available. With such a setup, we deal with a sparse-data dynamic tomography problem, which clearly calls for regularization due to the loss of information in the data and the ongoing motion. In this paper, we propose a 3D variational formulation based on 3D shearlets, where the third dimension accounts for the motion in time, to reconstruct a moving 2D object. Results are presented for real measured data and compared against a 2D static model, in the case of fan-beam geometry. The results are preliminary but show that better reconstructions can be achieved when motion is taken into account.
In this work, we consider the inverse problem of reconstructing the internal structure of an object from limited x-ray projections. We use a Gaussian process prior to model the target function and estimate its (hyper)parameters from measured data. In contrast to other established methods, this comes with the advantage of not requiring any manual parameter tuning, which usually arises in classical regularization strategies. Our method uses a basis function expansion technique for the Gaussian process which significantly reduces the computational complexity and avoids the need for numerical integration. The approach also allows for reformulation of some classical regularization methods, such as Laplacian and Tikhonov regularization, as Gaussian process regression, and hence provides an efficient algorithm and principled means for their parameter tuning. Results from simulated and real data indicate that this approach is less sensitive to streak artifacts as compared to the commonly used method of filtered backprojection.
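The connection between Gaussian process regression and Tikhonov-type regularization mentioned above can be made concrete in a small sketch (illustrative only; the paper's basis-function expansion is not reproduced here). For a linear forward model y = A f + e with noise e ~ N(0, σ²I) and prior f ~ N(0, K), the GP posterior mean coincides with the Tikhonov solution under the penalty fᵀK⁻¹f:

```python
import numpy as np

def gp_posterior_mean(A, y, K, sigma2):
    # Posterior mean of f | y for y = A f + e, e ~ N(0, sigma2 I), f ~ N(0, K)
    S = A @ K @ A.T + sigma2 * np.eye(A.shape[0])
    return K @ A.T @ np.linalg.solve(S, y)

def tikhonov(A, y, K, sigma2):
    # Equivalent variational form: argmin_f ||A f - y||^2 / sigma2 + f^T K^{-1} f
    H = A.T @ A / sigma2 + np.linalg.inv(K)
    return np.linalg.solve(H, A.T @ y / sigma2)
```

The two functions return the same vector (up to numerical precision), which is the sense in which classical regularizers correspond to particular GP covariance choices, and why GP hyperparameter estimation can serve as principled regularization-parameter tuning.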
In this article, we study Bayesian inverse problems with multi-layered Gaussian priors. The aim of the multi-layered hierarchical prior is to provide enough structural complexity to allow for both smoothing and edge-preserving properties at the same time. We first describe the conditionally Gaussian layers in terms of a system of stochastic partial differential equations. We then build the computational inference method using a finite-dimensional Galerkin method. We show that the proposed approximation has a convergence-in-probability property to the solution of the original multi-layered model. We then carry out Bayesian inference using the preconditioned Crank–Nicolson algorithm, modified to work with multi-layered Gaussian fields. We show via numerical experiments in signal deconvolution and computerized x-ray tomography problems that the proposed method can offer both smoothing and edge preservation at the same time.
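The preconditioned Crank–Nicolson (pCN) step has a simple generic form. Below is a minimal single-layer sketch for a standard Gaussian prior N(0, I) (the paper's multi-layer modification is not shown); `phi` is the negative log-likelihood, and the proposal leaves the prior invariant, so the acceptance ratio involves only `phi`:

```python
import numpy as np

def pcn_step(u, phi_u, phi, beta, rng):
    # pCN proposal: a prior-preserving AR(1) move for a N(0, I) prior
    prop = np.sqrt(1.0 - beta**2) * u + beta * rng.standard_normal(u.shape)
    phi_prop = phi(prop)
    # Accept with probability min(1, exp(phi(u) - phi(prop)))
    if np.log(rng.random()) < phi_u - phi_prop:
        return prop, phi_prop
    return u, phi_u
```

The dimension-independent acceptance behavior of pCN is what makes it attractive for discretized function-space problems such as tomography; the multi-layer variant in the article adapts this move to conditionally Gaussian layers.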
A two-dimensional sparse-data tomographic problem is studied. The target is assumed to be a homogeneous object bounded by a smooth curve. A non-uniform rational B-spline (NURBS) curve is used as a computational representation of the boundary. This approach conveniently provides the result in a format readily compatible with computer-aided design software. However, the linear tomography task becomes a nonlinear inverse problem because of the NURBS-based parameterization. Therefore, Bayesian inversion with Markov chain Monte Carlo sampling is used for calculating an estimate of the NURBS control points. The reconstruction method is tested with both simulated data and measured X-ray projection data. The proposed method recovers the shape and the attenuation coefficient significantly better than the baseline algorithm (optimally thresholded total variation regularization), but at the cost of heavier computation.
We present the current status of our project to develop a photon counting detector for medical imaging. One motivating example lies in producing a monitoring and dosimetry device for boron neutron capture therapy, which is currently not commercially available. Our approach combines in-house developed detectors, based on cadmium telluride or thick silicon, with readout chip technology developed for particle physics experiments at CERN. Here we describe the manufacturing process of our sensors as well as the processing steps for the assembly of the first prototypes. The prototypes currently use the PSI46digV2.1-r readout chip. The accompanying readout electronics chain used for the first measurements is also discussed. Finally, we present an advanced algorithm we developed for image reconstruction using such photon counting detectors, with a focus on boron neutron capture therapy. This work is conducted within a consortium.