Dynamic positron emission tomography (PET) imaging usually suffers from high statistical noise due to the low counts of short frames. This study aims to improve the image quality of short frames by utilizing information from another modality. We develop a deep learning-based joint filtering framework that simultaneously incorporates information from longer-acquisition PET frames and high-resolution magnetic resonance (MR) images into the short frames. The network inputs are noisy PET images and the corresponding MR images, while the outputs are the linear coefficients of a spatially variant linear representation model. The composite of all dynamic frames is used as the training label for each sample and is down-sampled to 1/10th of the counts to form the training input. The loss function during training combines an L1-norm term with two gradient-based regularizations. Ten realistic dynamic PET/MR phantoms based on BrainWeb are used for pre-training, and eleven clinical subjects from the Alzheimer's Disease Neuroimaging Initiative are used for fine-tuning. Simulation results show that the proposed method reduces statistical noise while preserving image details, and achieves quantitative improvements over Gaussian filtering, the guided filter, and a convolutional neural network trained with the mean squared error. On clinical data, the method outperforms the others in terms of mean activity and standard deviation. These results indicate that the proposed deep learning-based joint filtering framework has great potential for dynamic PET image denoising.
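Although the abstract does not give the exact formulation, the spatially variant linear representation and the training loss it describes can be sketched as follows. This is a minimal NumPy sketch: the coefficient maps `a` and `b`, the choice of guidance image, and the regularization weights `lam_g` and `lam_tv` are assumptions for illustration, not the authors' exact design.

```python
import numpy as np

def linear_representation(a, b, guidance):
    # Spatially variant linear model: each voxel of the denoised image is
    # an affine function of the guidance image, y = a * g + b, where the
    # network predicts the coefficient maps a and b (guided-filter style).
    return a * guidance + b

def gradients(img):
    # Forward-difference spatial gradients along both image axes.
    gy = np.diff(img, axis=0, append=img[-1:, :])
    gx = np.diff(img, axis=1, append=img[:, -1:])
    return gy, gx

def joint_filter_loss(pred, label, lam_g=0.1, lam_tv=0.01):
    # L1 data term plus two gradient-based regularizers (assumed forms):
    # (1) match the label's edges, (2) penalize spurious gradients (TV).
    l1 = np.mean(np.abs(pred - label))
    py, px = gradients(pred)
    ly, lx = gradients(label)
    edge_match = np.mean(np.abs(py - ly)) + np.mean(np.abs(px - lx))
    tv = np.mean(np.abs(py)) + np.mean(np.abs(px))
    return l1 + lam_g * edge_match + lam_tv * tv
```

In an actual training pipeline these operations would be implemented with a differentiable framework so gradients can flow back to the coefficient-predicting network; the NumPy version only illustrates the structure of the model and loss.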
Purpose Traditional registration of functional magnetic resonance images (fMRI) is typically achieved by registering their coregistered structural MRI. However, this cannot achieve accurate performance because functional units are not necessarily located consistently relative to anatomical structures. In addition, registration methods based on functional information focus on gray matter (GM) information but ignore the importance of white matter (WM). To overcome the limitations of existing techniques, in this paper we aim to register resting‐state fMRI (rs‐fMRI) directly from the rs‐fMRI data, making full use of GM and WM information to improve registration performance. Methods We provide a robust representation of WM functional connectivity features using tissue‐specific patch‐based functional correlation tensors (ts‐PFCTs) as auxiliary information to assist registration. Furthermore, we propose a semi‐supervised deep learning model that uses GM and WM information (GM ts‐PFCTs and WM ts‐PFCTs) during training as a fine‐tuning signal, improving registration accuracy even when such information is not provided for new test image pairs. We implement our method on the 1000 Functional Connectomes Project dataset. To evaluate the method, a group‐level analysis was performed on resting‐state brain functional networks after registration, producing t maps. Results Our method increases the peak t values of the t maps of the default mode network, visual network, central executive network, and sensorimotor network to 21.4, 20.0, 18.4, and 19.0, respectively. Compared with traditional methods (the FMRIB Software Library (FSL), Statistical Parametric Mapping with echo planar images (SPM_EPI), and SPM_T1), our method achieves average improvements of 67.39%, 12.96%, and 25.14%, respectively. Conclusion We propose a semi‐supervised deep learning network that adds GM and WM information as auxiliary information for resting‐state fMRI registration.
GM and WM information is extracted and described as GM ts‐PFCTs and WM ts‐PFCTs. Experimental results show that our method achieves superior registration performance.
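The abstract does not specify how the ts‐PFCT features are computed, but the general idea of a functional correlation tensor can be sketched as follows: for each voxel, correlations of its time series with those of its 26 neighbors are accumulated into a 3×3 tensor weighted by the neighbor directions. This is a simplified, patch‐free NumPy sketch (the tissue masking, patch aggregation, and boundary handling of the authors' ts‐PFCTs are omitted; `np.roll` wraparound at the volume edges is a simplification).

```python
import numpy as np

def functional_correlation_tensor(ts):
    """Sketch of a functional correlation tensor (FCT) field.

    ts: array of shape (X, Y, Z, T) holding rs-fMRI time series.
    Returns a (X, Y, Z, 3, 3) array where each voxel's tensor is the
    sum over its 26 neighbors of |corr(v, n)| * d d^T, with d the
    unit vector pointing from the voxel to that neighbor.
    """
    # z-score each voxel's time series so Pearson correlation reduces
    # to a mean of elementwise products.
    m = ts.mean(axis=-1, keepdims=True)
    s = ts.std(axis=-1, keepdims=True) + 1e-8
    z = (ts - m) / s

    tensors = np.zeros(ts.shape[:3] + (3, 3))
    offsets = [(i, j, k)
               for i in (-1, 0, 1) for j in (-1, 0, 1) for k in (-1, 0, 1)
               if (i, j, k) != (0, 0, 0)]
    for (i, j, k) in offsets:
        d = np.array([i, j, k], dtype=float)
        d /= np.linalg.norm(d)
        outer = np.outer(d, d)                      # 3x3 direction dyad
        # Align each voxel with its neighbor at offset (i, j, k);
        # np.roll wraps around at the edges (a simplification).
        shifted = np.roll(z, shift=(-i, -j, -k), axis=(0, 1, 2))
        corr = (z * shifted).mean(axis=-1)          # Pearson r per voxel
        tensors += np.abs(corr)[..., None, None] * outer
    return tensors
```

In the paper's setting, such tensors computed within GM and WM masks would serve as the auxiliary channels guiding the semi‐supervised registration network during training.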