2002
DOI: 10.1109/tns.2002.998681

A concave prior penalizing relative differences for maximum-a-posteriori reconstruction in emission tomography

Abstract: A well-known problem with maximum likelihood reconstruction in emission tomography is excessive noise propagation. To prevent this, the objective function is often extended with a Gibbs prior favoring smooth solutions. We hypothesize that the following three requirements should produce a useful and conservative Gibbs prior for emission tomography: 1) the prior function should be concave to ensure that the posterior has a unique maximum; 2) the prior should penalize relative differences rather than absolute…
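The abstract is truncated before the prior itself appears. For orientation, the relative difference prior proposed in this paper is usually quoted as

U(λ) = Σ_j Σ_{k ∈ N_j} (λ_j − λ_k)² / (λ_j + λ_k + γ |λ_j − λ_k|),

where λ_j and λ_k are intensities of neighbouring voxels, N_j is the neighbourhood of voxel j, and γ ≥ 0 controls edge preservation (γ = 0 gives the plain relative difference penalty). Because the sum of the two intensities sits in the denominator, a given absolute difference is penalized less where the activity is high, which is what makes the penalty relative rather than absolute.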

Cited by 167 publications (76 citation statements)
References 8 publications

“…We reconstructed the measurement data with a listmode-based three-dimensional maximum likelihood expectation maximization algorithm including self-normalization and resolution recovery (MLEM-RM) (Salomon et al 2012). The reconstruction algorithm grouped the data into 16 ordered subsets with 5 sub-iterations each, regularized with a relative difference prior with dynamic edge preservation parameters (Andreyev et al 2017, Nuyts et al 2002), and used a voxel pitch of 0.25 mm. Additionally, the reconstruction employed a likelihood-based rejection of inter-crystal detector scatter (Gross-Weege et al 2016) and applied corrections for attenuation, scatter and randoms.…”
Section: Methods
confidence: 99%
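For readers who want the shape of such a reconstruction step, here is a minimal numpy sketch of one ordered-subsets sub-iteration with a one-step-late relative difference penalty. This is a generic illustration, not the MLEM-RM implementation cited above: the dense system matrix A_sub, the subset sensitivity sens_sub, and the fixed gamma are simplifying assumptions (the cited work uses dynamic edge preservation parameters).

import numpy as np

def rdp_gradient(lam, gamma=2.0, eps=1e-9):
    # Gradient of U = sum_j sum_k d^2 / (s + gamma*|d|) over a
    # 4-neighbourhood of a 2D image, with d = lam_j - lam_k and
    # s = lam_j + lam_k. np.roll gives periodic borders, which is
    # acceptable for a sketch.
    g = np.zeros_like(lam)
    for shift in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nb = np.roll(lam, shift, axis=(0, 1))
        d, s = lam - nb, lam + nb
        denom = (s + gamma * np.abs(d) + eps) ** 2
        g += d * (gamma * np.abs(d) + lam + 3.0 * nb) / denom
    return g

def osem_rdp_subiteration(lam, y_sub, A_sub, sens_sub, beta, gamma=2.0):
    # One-step-late MAP-EM update: the penalty gradient, evaluated at
    # the current image, is added to the sensitivity in the denominator.
    flat = lam.ravel()
    expected = A_sub @ flat + 1e-9            # forward projection
    ratio = A_sub.T @ (y_sub / expected)      # backprojected data ratio
    denom = sens_sub + beta * rdp_gradient(lam, gamma).ravel()
    return (flat * ratio / np.maximum(denom, 1e-9)).reshape(lam.shape)

In practice the gradient would be masked at image borders, and gamma would be made locally adaptive to obtain the dynamic edge preservation the quoted study describes.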
“…To prevent excessive noise propagation, the iterations can be stopped before full convergence, but at the expense of reduced quantitative accuracy. Alternatively, the objective function can be extended with a prior favouring smooth solutions [19], and such algorithms can achieve global convergence while retaining fast initial convergence speed [20]. Q.Clear, a Bayesian penalized likelihood technique, uses a relative difference penalty which is a function of the difference between neighbouring voxels as well as a function of their sum.…”
Section: Introduction
confidence: 99%
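The "difference as well as sum" remark is the key property. A toy check (our own numbers, plain Python) shows that for the same 20% relative difference, the penalty grows only linearly with intensity, whereas a quadratic penalty would grow quadratically:

def rdp_pair(a, b, gamma=0.0):
    # Relative difference penalty for a single neighbour pair.
    d, s = a - b, a + b
    return d * d / (s + gamma * abs(d))

print(rdp_pair(10, 12))    # ~0.18
print(rdp_pair(100, 120))  # ~1.82, i.e. 10x larger
# A quadratic penalty (a - b)**2 would jump from 4 to 400 (100x),
# hitting high-activity regions much harder.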
“…The Q.Clear protocol is a Bayesian penalized likelihood reconstruction algorithm which incorporates a penalty factor to control noise [13]. It includes time-of-flight (TOF) information and point spread function (PSF) modelling, taking into account resolution-degrading effects such as positron range, photon non-collinearity, and detector-related effects including crystal widths, inter-crystal scattering, and inter-crystal penetration (depth of interaction effects).…”
Section: Methods
confidence: 99%
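Resolution modelling of the kind listed above is often folded into the forward projector as an image-space blur applied before projection. The sketch below is only a generic stand-in, not GE's Q.Clear implementation; the Gaussian kernel, the sigma value, and the additive term standing in for scatter and randoms are all assumptions:

import numpy as np
from scipy.ndimage import gaussian_filter

def psf_forward_model(lam, A, psf_sigma=1.0, additive=0.0):
    # Blur the image with a resolution kernel standing in for positron
    # range, photon non-collinearity and detector effects, then project.
    blurred = gaussian_filter(lam, sigma=psf_sigma)
    return A @ blurred.ravel() + additive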