This paper describes a statistical image reconstruction method for X-ray computed tomography (CT) that is based on a physical model that accounts for the polyenergetic X-ray source spectrum and the measurement nonlinearities caused by energy-dependent attenuation. We assume that the object consists of a given number of nonoverlapping materials, such as soft tissue and bone. The attenuation coefficient of each voxel is the product of its unknown density and a known energy-dependent mass attenuation coefficient. We formulate a penalized-likelihood function for this polyenergetic model and develop an ordered-subsets iterative algorithm for estimating the unknown densities in each voxel. The algorithm monotonically decreases the cost function at each iteration when one subset is used. Applying this method to simulated X-ray CT measurements of objects containing both bone and soft tissue yields images with significantly reduced beam hardening artifacts.
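The measurement nonlinearity that causes beam hardening follows directly from this model. Below is a minimal sketch of the polyenergetic forward model described above, where each voxel's attenuation coefficient is the product of its unknown density and a known energy-dependent mass attenuation coefficient. All numerical values (spectrum, mass attenuation coefficients, geometry) are illustrative assumptions, not data from the paper.

```python
import numpy as np

def expected_counts(A, rho, mass_atten, spectrum):
    """Expected photon counts under the polyenergetic model:
    E[y_i] = sum_k I_k * exp(-sum_j a_ij * rho_j * m_j(E_k)).

    A          : (n_rays, n_voxels) intersection lengths [cm]
    rho        : (n_voxels,) unknown densities [g/cm^3]
    mass_atten : (n_voxels, n_energies) mass attenuation coefficients [cm^2/g]
    spectrum   : (n_energies,) incident photon counts per energy bin
    """
    # Line integral of attenuation at each energy: shape (n_rays, n_energies).
    line_int = A @ (rho[:, None] * mass_atten)
    # Spectrum-weighted survival probabilities -- nonlinear in rho.
    return np.exp(-line_int) @ spectrum

# Tiny example: one ray through two voxels (soft tissue, bone), two energy bins.
A = np.array([[1.0, 1.0]])              # ray crosses each voxel for 1 cm
rho = np.array([1.0, 1.9])              # illustrative densities
mass_atten = np.array([[0.25, 0.20],    # soft tissue: m(E_low), m(E_high)
                       [0.60, 0.30]])   # bone attenuates low energies more
spectrum = np.array([5e4, 5e4])
y = expected_counts(A, rho, mass_atten, spectrum)
```

Because low-energy photons are attenuated preferentially, the effective attenuation drops as material thickness grows, so doubling the densities less than doubles the log-transmission; this concavity is the beam hardening the algorithm models explicitly.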
This paper describes a statistical image reconstruction method for X-ray CT that is based on a physical model that accounts for the polyenergetic X-ray source spectrum and the measurement nonlinearities caused by energy-dependent attenuation. Unlike our earlier work, the proposed algorithm does not require pre-segmentation of the object into the various tissue classes (e.g., bone and soft tissue) and allows mixed pixels. The attenuation coefficient of each voxel is modeled as the product of its unknown density and a weighted sum of energy-dependent mass attenuation coefficients. We formulate a penalized-likelihood function for this polyenergetic model and develop an iterative algorithm for estimating the unknown density of each voxel. Applying this method to simulated X-ray CT measurements of objects containing both bone and soft tissue yields images with significantly reduced beam hardening artifacts relative to conventional beam hardening correction methods. We also apply the method to real data acquired from a phantom containing various concentrations of potassium phosphate solution. The algorithm reconstructs an image with accurate density values for the different concentrations, demonstrating its potential for quantitative CT applications.
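A minimal sketch of the mixed-pixel idea described above: each voxel's attenuation is its density times a weighted combination of soft-tissue and bone mass attenuation coefficients, so no hard segmentation is needed. The weighting function used here (linear in density, clipped to [0, 1]) and all numerical values are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

def tissue_weights(rho, rho_soft=1.0, rho_bone=1.9):
    """Illustrative mixed-pixel weights: the bone fraction grows linearly
    with density between assumed pure-soft-tissue and pure-bone densities."""
    w_bone = float(np.clip((rho - rho_soft) / (rho_bone - rho_soft), 0.0, 1.0))
    return 1.0 - w_bone, w_bone

def voxel_mu(rho, m_soft, m_bone):
    """mu_j(E) = rho_j * (w_soft(rho_j) * m_soft(E) + w_bone(rho_j) * m_bone(E)),
    evaluated at a single energy with scalar mass attenuation coefficients."""
    w_s, w_b = tissue_weights(rho)
    return rho * (w_s * m_soft + w_b * m_bone)

# A voxel at rho = 1.0 behaves as pure soft tissue, at rho = 1.9 as pure
# bone, and intermediate densities blend the two -- a "mixed pixel".
mu_soft = voxel_mu(1.0, 0.25, 0.60)
mu_mix = voxel_mu(1.45, 0.25, 0.60)
mu_bone = voxel_mu(1.9, 0.25, 0.60)
```

Because the weights are a fixed function of the (unknown) density, the density of each voxel remains the only unknown, which is what makes the single-parameter-per-voxel estimation in the paper possible.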
We report a novel approach to statistical image reconstruction in X-ray CT. Statistical image reconstruction depends on maximizing a likelihood derived from a statistical model for the measurements. Traditionally, the measurements have been assumed to follow a Poisson distribution, but more recent work has argued that CT measurements actually follow a compound Poisson distribution due to the polyenergetic nature of the X-ray source. Unlike the Poisson distribution, compound Poisson statistics have a complicated likelihood that impedes direct use of statistical reconstruction. Using a generalization of the saddle-point integration method, we derive an approximate likelihood for use with iterative algorithms. In its most realistic form, the approximate likelihood we derive accounts for polyenergetic X-rays and Poisson light statistics in the detector scintillator, and can be extended to account for additive electronic noise. The approximate likelihood is closer to the exact likelihood than is the conventional Poisson likelihood, and carries the promise of more accurate reconstruction, especially in low X-ray dose situations.
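The difference between Poisson and compound Poisson statistics is easy to demonstrate by simulation: when the detector records total deposited energy rather than a photon count, the variance of the signal exceeds what a matched Poisson model predicts. The discrete spectrum and mean count below are made-up illustrative values, and this Monte Carlo sketch is not the paper's saddle-point derivation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Compound Poisson model: Y = sum_{n=1}^{N} E_n, where the photon count
# N ~ Poisson(lam) and each photon's energy E_n is drawn i.i.d. from a
# discrete spectrum.  Illustrative values only.
energies = np.array([30.0, 60.0, 90.0])   # keV
probs = np.array([0.3, 0.5, 0.2])         # spectrum weights
lam = 100.0                               # mean photon count per measurement

def sample_compound_poisson(size):
    """Draw `size` compound Poisson measurements (total deposited energy)."""
    counts = rng.poisson(lam, size)
    return np.array([energies[rng.choice(3, k, p=probs)].sum() for k in counts])

mean_E = energies @ probs                 # E[E]
mean_E2 = (energies ** 2) @ probs         # E[E^2]
y = sample_compound_poisson(20000)
# Analytically: E[Y] = lam * E[E], Var[Y] = lam * E[E^2].  Since
# E[E^2] > E[E]^2 for any non-monoenergetic spectrum, the signal is
# overdispersed relative to a scaled Poisson variable with the same mean.
```

The extra variance term is exactly what the conventional Poisson likelihood misses, and why the effect matters most at low dose, where the relative noise is largest.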
Dual-energy (DE) X-ray computed tomography (CT) has shown promise for material characterization and for providing quantitatively accurate CT values in a variety of applications. However, DE-CT has not been used routinely in medicine to date, primarily due to dose considerations. Most methods for DE-CT have used the filtered backprojection method for image reconstruction, leading to suboptimal noise/dose properties. This paper describes a statistical (maximum-likelihood) method for dual-energy X-ray CT that accommodates a wide variety of potential system configurations and measurement noise models. Regularized methods (such as penalized-likelihood or Bayesian estimation) are straightforward extensions. One version of the algorithm monotonically decreases the negative log-likelihood cost function at each iteration. An ordered-subsets variation of the algorithm provides a fast and practical version.
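To illustrate the basic dual-energy principle (not the paper's maximum-likelihood algorithm): with idealized, noiseless monoenergetic measurements at two energies, the two basis-material density-length products solve a 2×2 linear system. The mass attenuation values and material amounts below are illustrative assumptions.

```python
import numpy as np

# Assumed mass attenuation coefficients [cm^2/g] for two basis materials
# (soft tissue, bone) at two measurement energies -- illustrative values.
M = np.array([[0.25, 0.60],   # [m_soft(E_low),  m_bone(E_low)]
              [0.20, 0.30]])  # [m_soft(E_high), m_bone(E_high)]

# True density-length products [g/cm^2] along one ray.
s_true = np.array([2.0, 0.5])

# Ideal log-transmission data at the two energies: p = M @ s.
p = M @ s_true

# Material decomposition: invert the 2x2 system.  The matrix is well
# conditioned only when the two materials' energy dependence differs
# enough between the two spectra.
s_hat = np.linalg.solve(M, p)
```

Real DE-CT data are noisy and the spectra are broad, which makes this direct inversion noise-amplifying; that is the motivation for the statistical formulation with explicit measurement noise models described in the abstract.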