Purpose Respiratory motion of patients during positron emission tomography (PET)/computed tomography (CT) imaging affects both image quality and quantitative accuracy. Hardware-based motion estimation, the current clinical standard, requires initial setup, maintenance, and calibration of the equipment, and can cause patient discomfort. Data-driven techniques are an active area of research, with limited exploration of lesion-specific motion estimation. This paper introduces a time-of-flight (TOF)-weighted positron emission particle tracking (PEPT) algorithm that facilitates lesion-specific respiratory motion estimation from raw listmode PET data. Methods The TOF-PEPT algorithm was implemented and investigated under different scenarios: (a) a phantom study with a point source and an Anzai band for respiratory motion tracking; (b) a phantom study with a point source only, no Anzai band; (c) two clinical studies with point sources and the Anzai band; (d) two clinical studies with point sources only, no Anzai band; and (e) two clinical studies using lesions/internal regions instead of point sources and no Anzai band. In the studies involving radioactive point sources, the sources were placed on the patients during PET/CT imaging. Motion tracking was performed using a preselected region of interest (ROI), manually drawn around the point sources or lesions on reconstructed images. The extracted motion signals were compared with the Anzai band signal where applicable. For additional comparison, a center-of-mass (COM) algorithm was implemented both with and without the use of TOF information. Using the motion estimate from each method, amplitude-based gating was applied and gated images were reconstructed. Results The TOF-PEPT algorithm is shown to successfully determine the respiratory motion in both the phantom and clinical studies.
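The COM-style motion trace referenced above can be illustrated with a minimal sketch. This is not the authors' implementation; the event layout (timestamps plus TOF-derived most-likely annihilation coordinates), the spherical ROI, and the choice of the axial coordinate as the motion signal are all assumptions for illustration:

```python
import numpy as np

def tof_com_trace(events, roi_center, roi_radius, bin_ms=100.0):
    """Estimate a 1-D motion trace from listmode-style PET events.

    `events` is assumed to be a structured array with fields:
      t       - event time in ms
      x, y, z - most-likely annihilation position (mm), placed along the
                line of response at the TOF-derived offset (assumption)
    Events outside a spherical ROI around the tracked point source/lesion
    are discarded; the axial (z) center of mass of the remaining events is
    computed per time bin, yielding the respiratory motion signal.
    """
    # Keep only events whose TOF-localized position falls inside the ROI.
    d = np.sqrt((events["x"] - roi_center[0]) ** 2
                + (events["y"] - roi_center[1]) ** 2
                + (events["z"] - roi_center[2]) ** 2)
    ev = events[d <= roi_radius]

    # Assign each event to a time bin and average z per bin (center of mass).
    bins = (ev["t"] // bin_ms).astype(int)
    n_bins = int(bins.max()) + 1
    counts = np.bincount(bins, minlength=n_bins)
    z_sum = np.bincount(bins, weights=ev["z"], minlength=n_bins)
    with np.errstate(invalid="ignore"):
        trace = z_sum / counts  # NaN where a bin contains no events
    return trace
```

The same binning with TOF-based weighting of each event's contribution, rather than a plain average, is the distinction the paper draws between the COM and TOF-PEPT variants.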
The derived motion signals correlated well with the Anzai band: correlation coefficients of 0.99 and 0.94–0.97 were obtained for the phantom study and the clinical studies, respectively. TOF-PEPT was found to be 13–38% better correlated with the Anzai results than the COM methods. Maximum standardized uptake values (SUVs) were used to quantitatively compare the reconstructed gated images. Compared with the ungated image, the TOF-PEPT-based gated images showed a 14–39% increase in the maximum SUV across several lesion areas and an 8.7% increase in the maximum SUV in the tracked lesion area. The distinct presence of lesions, reduced blurring, and generally sharper images were readily apparent in all clinical studies. In addition, maximum SUVs were found to be 4–10% higher in the TOF-PEPT-based gated images than in those based on the Anzai and COM methods. Conclusion A PEPT-based algorithm has been presented for determining movement due to respiratory motion during PET/CT imaging. Gating based on the motion estimate is shown to quantifiably improve image quality in both a controlled point-source phantom study and in clinical patient studies. The algorithm has the...
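The amplitude-based gating step described in the Methods above can be sketched as follows. The trace layout (one amplitude value per fixed-width time bin) and the per-event bin lookup are assumptions for illustration, not the paper's implementation:

```python
import numpy as np

def amplitude_gate(event_times, trace, bin_ms, low, high):
    """Select listmode events whose instantaneous motion amplitude lies
    inside the gate window [low, high].

    `trace` holds the motion signal, one amplitude per time bin of width
    `bin_ms`; each event's timestamp is looked up in its covering bin.
    Returns a boolean mask over the events; in an amplitude-gated
    reconstruction, only the masked-in events would be reconstructed.
    """
    # Clip to the last bin so late events do not index past the trace.
    bins = np.minimum((event_times // bin_ms).astype(int), len(trace) - 1)
    amp = trace[bins]
    return (amp >= low) & (amp <= high)
```

Gating discards events acquired outside the chosen amplitude window, which is what trades count statistics for the reduced motion blur reported in the Results.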
Objectives Positron emission tomography (PET) is susceptible to patient movement during a scan. Head motion is a continuing problem for brain PET imaging and diagnostic assessments. Physical head restraints and external motion tracking systems are most commonly used to address this issue. Data-driven methods offer substantial advantages, such as retroactive processing, but typically require manual interaction for robustness. In this work, we introduce a time-of-flight (TOF) weighted positron emission particle tracking (PEPT) algorithm that facilitates fully automated, data-driven head motion detection and subsequent automated correction of the raw listmode data. Materials and methods We used our previously published TOF-PEPT algorithm (Osborne et al., 2017; Tumpa et al.; Tumpa et al., 2021) to automatically identify frames where the patient was near-motionless. The first such static frame was used as a reference to which subsequent static frames were registered. The underlying rigid transformations were estimated using weak radioactive point sources placed on radiolucent glasses worn by the patient. Correction of the raw event data was achieved by tracking the point sources in the listmode data; the events were then repositioned to allow reconstruction of a single image. To create a “gold standard” for comparison, a frame-by-frame image-registration-based correction was implemented: the original listmode data were used to reconstruct an image for each static frame detected by our algorithm, and manual landmark registration with external software was then applied to merge these into a single image. Results We report on five patient studies. The TOF-PEPT algorithm was configured to detect motion using a 500 ms window. Our event-based correction produced images that were visually free of motion artifacts. Comparison of our algorithm with the frame-based image registration approach produced nearly indistinguishable results.
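Estimating a rigid transformation from a few matched point-source positions, as described above, is commonly done with the Kabsch/Procrustes method. The sketch below assumes matched Nx3 arrays of point-source centroids in the reference frame and in a later static frame; the paper does not state that this exact solver was used:

```python
import numpy as np

def rigid_fit(ref_pts, mov_pts):
    """Least-squares rigid transform (R, t) mapping mov_pts onto ref_pts.

    Kabsch/Procrustes solution: both matched point sets (N x 3 arrays of
    point-source centroids) are centered, the optimal rotation comes from
    the SVD of the cross-covariance matrix, and the translation follows
    from the centroids. A corrected point is then R @ p + t.
    """
    ref_c = ref_pts.mean(axis=0)
    mov_c = mov_pts.mean(axis=0)
    H = (mov_pts - mov_c).T @ (ref_pts - ref_c)  # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = ref_c - R @ mov_c
    return R, t
```

Applying the recovered (R, t) to every event position in a static frame is one way to realize the event repositioning step before reconstructing a single motion-corrected image.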
Quantitatively, Jaccard similarity indices were in the range of 85–98% for the former and 84–98% for the latter when comparing the static-frame images with their reference-frame counterparts. Discussion We have presented a fully automated, data-driven method for motion detection and correction of raw listmode data. Easy to implement, the approach achieved high temporal resolution and reliable performance for head motion correction. Our methodology provides a mechanism by which patient motion incurred during imaging can be assessed and corrected post hoc.
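The Jaccard similarity index used above is the ratio of intersection to union between two segmentations. A minimal sketch, assuming boolean masks of the same structure in a static-frame image and the reference-frame image:

```python
import numpy as np

def jaccard(mask_a, mask_b):
    """Jaccard similarity |A ∩ B| / |A ∪ B| between two boolean masks.

    A value of 1.0 means identical segmentations; values near the
    85-98% range reported above indicate close agreement between the
    static-frame and reference-frame images.
    """
    inter = np.logical_and(mask_a, mask_b).sum()
    union = np.logical_or(mask_a, mask_b).sum()
    return inter / union if union else 1.0
```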