Background: Artificial intelligence (AI) is about to transform medical imaging. The Research Consortium for Medical Image Analysis (RECOMIA), a not-for-profit organisation, has developed an online platform to facilitate collaboration between medical researchers and AI researchers. The aim is to minimise the time and effort researchers need to spend on technical aspects, such as transfer, display, and annotation of images, as well as legal aspects, such as de-identification. The purpose of this article is to present the RECOMIA platform and its AI-based tools for organ segmentation in computed tomography (CT), which can be used for extraction of standardised uptake values from the corresponding positron emission tomography (PET) image. Results: The RECOMIA platform includes modules for (1) local de-identification of medical images, (2) secure transfer of images to the cloud-based platform, (3) display functions available using a standard web browser, (4) tools for manual annotation of organs or pathology in the images, (5) deep learning-based tools for organ segmentation or other customised analyses, (6) tools for quantification of segmented volumes, and (7) an export function for the quantitative results. The AI-based tool for organ segmentation in CT currently handles 100 organs (77 bones and 23 soft tissue organs). The segmentation is based on two convolutional neural networks (CNNs): one network to handle organs with multiple similar instances, such as vertebrae and ribs, and one network for all other organs. The CNNs have been trained using CT studies from 339 patients. Experienced radiologists annotated organs in the CT studies. The performance of the segmentation tool, measured as the mean Dice index on a manually annotated test set with 10 representative organs, was 0.93 for all foreground voxels, and the mean Dice index over the organs was 0.86 (0.82 for the soft tissue organs and 0.90 for the bones).
Conclusion: The paper presents a platform that provides deep learning-based tools that can perform basic organ segmentations in CT, which can then be used to automatically obtain different measurements in the corresponding PET image. The RECOMIA platform is available on request at www.recomia.org for research purposes.
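The Dice index reported above measures voxel overlap between an automated and a manual segmentation. A minimal sketch of the computation, using NumPy and toy binary masks (the function name and data are illustrative, not part of the RECOMIA platform):

```python
import numpy as np

def dice_index(pred: np.ndarray, truth: np.ndarray) -> float:
    """Sørensen-Dice overlap between two binary segmentation masks."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    denom = pred.sum() + truth.sum()
    # Convention: two empty masks are in perfect agreement.
    return 2.0 * intersection / denom if denom else 1.0

# Toy 3D masks: two partially overlapping cubes in a small volume.
pred = np.zeros((10, 10, 10), dtype=bool)
truth = np.zeros((10, 10, 10), dtype=bool)
pred[2:6, 2:6, 2:6] = True   # 64 voxels
truth[3:7, 3:7, 3:7] = True  # 64 voxels, 27 shared with pred
print(round(dice_index(pred, truth), 4))  # → 0.4219 (2*27/128)
```

A Dice index of 1.0 means perfect overlap; the 0.86 reported above indicates that, per organ, the automated and manual masks shared most of their voxels.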
Aim To validate a deep-learning (DL) algorithm for automated quantification of prostate cancer on positron emission tomography/computed tomography (PET/CT) and explore the potential of PET/CT measurements as prognostic biomarkers. Material and methods Training of the DL algorithm regarding prostate volume was performed on manually segmented CT images in 100 patients. Validation of the DL algorithm was carried out in 45 patients with biopsy-proven hormone-naïve prostate cancer. The automated measurements of prostate volume were compared with manual measurements made independently by two observers. PET/CT measurements of tumour burden based on volume and SUV of abnormal voxels were calculated automatically. Voxels in the co-registered 18F-choline PET images above a standardized uptake value (SUV) of 2.65, and corresponding to the prostate as defined by the automated segmentation in the CT images, were defined as abnormal. Validation of abnormal voxels was performed by manual segmentation of radiotracer uptake. Agreement between the algorithm and the observers regarding prostate volume was analysed by the Sørensen-Dice index (SDI). Associations between automatically derived PET/CT biomarkers and age, prostate-specific antigen (PSA), and Gleason score, as well as overall survival, were evaluated by a univariate Cox regression model. Results The SDI between the automated and the two manual volume segmentations was 0.78 and 0.79, respectively. Automated PET/CT measures reflecting total lesion uptake and the relation between the volume of abnormal voxels and total prostate volume were significantly associated with overall survival (P = 0.02), whereas age, PSA, and Gleason score were not. Conclusion Automated PET/CT biomarkers showed good agreement with manual measurements and were significantly associated with overall survival.
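The abnormal-voxel definition above combines two masks: voxels inside the CT-derived prostate segmentation whose co-registered PET SUV exceeds 2.65. A minimal sketch assuming NumPy arrays for the SUV volume and prostate mask (the function name and toy data are illustrative, not the study's code):

```python
import numpy as np

SUV_THRESHOLD = 2.65  # abnormal-voxel cut-off used in the study

def abnormal_fraction(suv: np.ndarray, prostate_mask: np.ndarray) -> float:
    """Fraction of prostate voxels whose PET SUV exceeds the threshold."""
    inside = prostate_mask.astype(bool)
    abnormal = (suv > SUV_THRESHOLD) & inside  # both conditions must hold
    return abnormal.sum() / inside.sum()

# Toy 2x2 SUV image; one corner voxel lies outside the prostate mask.
suv = np.array([[1.0, 3.0],
                [2.7, 0.5]])
mask = np.array([[True, True],
                 [True, False]])
print(round(abnormal_fraction(suv, mask), 3))  # → 0.667 (2 of 3 prostate voxels)
```

The same abnormal mask is the basis for the volume- and uptake-based tumour-burden measures evaluated in the Cox model.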
Background Body composition is associated with survival outcome in oncological patients, but it is not routinely calculated. Manual segmentation of subcutaneous adipose tissue (SAT) and muscle is time-consuming and therefore limited to a single CT slice. Our goal was to develop an artificial-intelligence (AI)-based method for automated quantification of three-dimensional SAT and muscle volumes from CT images. Methods Ethical approvals from Gothenburg and Lund Universities were obtained. Convolutional neural networks were trained to segment SAT and muscle using manual segmentations on CT images from a training group of 50 patients. The method was applied to a separate test group of 74 cancer patients, who had two CT studies each with a median interval between the studies of 3 days. Manual segmentations in a single CT slice were used for comparison. The accuracy was measured as overlap between the automated and manual segmentations. Results The accuracy of the AI method was 0.96 for SAT and 0.94 for muscle. The average differences in volumes were significantly lower than the corresponding differences in areas in a single CT slice: 1.8% versus 5.0% (p < 0.001) for SAT and 1.9% versus 3.9% (p < 0.001) for muscle. The 95% confidence intervals for predicted volumes in an individual subject from the corresponding single CT slice areas were in the order of ± 20%. Conclusions The AI-based tool for quantification of SAT and muscle volumes showed high accuracy and reproducibility and provided a body composition analysis that is more relevant than manual analysis of a single CT slice.
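Converting a three-dimensional segmentation into a tissue volume is a product of voxel count and voxel size. A minimal sketch under an assumed voxel spacing (the function name and toy mask are illustrative, not the authors' implementation):

```python
import numpy as np

def tissue_volume_ml(mask: np.ndarray, spacing_mm: tuple) -> float:
    """Volume of a binary tissue mask in millilitres, given voxel spacing in mm."""
    voxel_ml = float(np.prod(spacing_mm)) / 1000.0  # mm^3 per voxel -> ml
    return mask.astype(bool).sum() * voxel_ml

# Toy SAT mask: 1000 voxels at 1 x 1 x 5 mm spacing -> 5 ml.
mask = np.zeros((10, 20, 10), dtype=bool)
mask[:, :10, :] = True
print(tissue_volume_ml(mask, (1.0, 1.0, 5.0)))  # → 5.0
```

Summing over the full stack rather than a single slice is what allows the volume differences of roughly 2% reported above, versus roughly 4-5% for single-slice areas.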
Introduction Lymph node metastases are a key prognostic factor in prostate cancer (PCa), but detecting lymph node lesions from PET/CT images is a subjective process resulting in inter‐reader variability. Artificial intelligence (AI)‐based methods can provide an objective image analysis. We aimed at developing and validating an AI‐based tool for detection of lymph node lesions. Methods A group of 399 patients with biopsy‐proven PCa who had undergone 18F‐choline PET/CT for staging prior to treatment were used to train (n = 319) and test (n = 80) the AI‐based tool. The tool consisted of convolutional neural networks using complete PET/CT scans as inputs. In the test set, the AI‐based lymph node detections were compared to those of two independent readers. The association with PCa‐specific survival was investigated. Results The AI‐based tool detected more lymph node lesions than Reader B (98 vs. 87/117; p = .045) using Reader A as reference. AI‐based tool and Reader A showed similar performance (90 vs. 87/111; p = .63) using Reader B as reference. The number of lymph node lesions detected by the AI‐based tool, PSA, and curative treatment was significantly associated with PCa‐specific survival. Conclusion This study shows the feasibility of using an AI‐based tool for automated and objective interpretation of PET/CT images that can provide assessments of lymph node lesions comparable with that of experienced readers and prognostic information in PCa patients.
Summary Aim To test the feasibility of a fully automated artificial intelligence‐based method providing PET measures of prostate cancer (PCa). Methods A convolutional neural network (CNN) was trained for automated measurements in 18F‐choline (FCH) PET/CT scans obtained prior to radical prostatectomy (RP) in 45 patients with newly diagnosed PCa. Automated values were obtained for prostate volume, maximal standardized uptake value (SUVmax), mean standardized uptake value of voxels considered abnormal (SUVmean), and volume of abnormal voxels (Volabn). The product SUVmean × Volabn was calculated to reflect total lesion uptake (TLU). Corresponding manual measurements were performed. CNN‐estimated data were compared with the weighed, surgically removed tissue specimens and with manually derived data, and related to clinical parameters assuming that 1 g ≈ 1 ml of tissue. Results The mean (range) weight of the prostate specimens was 44 g (20–109), while the CNN‐estimated volume was 62 ml (31–108), with a mean difference of 13.5 g or ml (95% CI: 9.78–17.32). The two measures were significantly correlated (r = 0.77, P < 0.001). Mean differences (95% CI) between CNN‐based and manually derived PET measures of SUVmax, SUVmean, Volabn (ml), and TLU were 0.37 (−0.01 to 0.75), −0.08 (−0.30 to 0.14), 1.40 (−2.26 to 5.06), and 9.61 (−3.95 to 23.17), respectively. The PET findings Volabn and TLU correlated with PSA (P < 0.05), but not with Gleason score or stage. Conclusion Automated CNN segmentation provided, within seconds, volume and simple PET measures similar to manually derived ones. Further studies on automated CNN segmentation with newer tracers, such as radiolabelled prostate‐specific membrane antigen, are warranted.
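Once the abnormal voxels are defined, the measures above reduce to a few array reductions: SUVmax and SUVmean over the abnormal voxels, Volabn from the voxel count, and TLU as the product SUVmean × Volabn. A hedged sketch with a toy one-dimensional SUV array (the measure names and the 2.65 threshold follow the papers' definitions; the code itself is illustrative):

```python
import numpy as np

def pet_measures(suv: np.ndarray, abnormal: np.ndarray, voxel_ml: float):
    """SUVmax, SUVmean over abnormal voxels, Volabn (ml), and TLU = SUVmean * Volabn."""
    suv_max = float(suv[abnormal].max())
    suv_mean = float(suv[abnormal].mean())
    vol_abn = float(abnormal.sum()) * voxel_ml
    return suv_max, suv_mean, vol_abn, suv_mean * vol_abn

# Toy SUV values; three of four voxels exceed the abnormality threshold.
suv = np.array([3.0, 4.0, 5.0, 1.0])
abnormal = suv > 2.65
smax, smean, vol, tlu = pet_measures(suv, abnormal, voxel_ml=0.5)
print(smax, smean, vol, tlu)  # → 5.0 4.0 1.5 6.0
```

TLU is analogous to total lesion glycolysis in FDG PET: it weights the abnormal volume by its mean uptake, so a small hot lesion and a large faint one can be distinguished from each other.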
Here, we aimed to develop and validate a fully automated artificial intelligence (AI)-based method for the detection and quantification of suspected prostate tumour/local recurrence, lymph node metastases, and bone metastases from [18F]PSMA-1007 positron emission tomography-computed tomography (PET-CT) images. Images from 660 patients were included. Segmentations by one expert reader served as ground truth. A convolutional neural network (CNN) was developed and trained on a training set, and the performance was tested on a separate test set of 120 patients. The AI method was compared with manual segmentations performed by several nuclear medicine physicians. Assessment of tumour burden (total lesion volume (TLV) and total lesion uptake (TLU)) was performed. The sensitivity of the AI method was, on average, 79% for detecting prostate tumour/recurrence, 79% for lymph node metastases, and 62% for bone metastases. On average, the nuclear medicine physicians' corresponding sensitivities were 78%, 78%, and 59%, respectively. The correlations of TLV and TLU between the AI method and the nuclear medicine physicians were all statistically significant and ranged from R = 0.53 to R = 0.83. In conclusion, the development of an AI-based method for prostate cancer detection with sensitivity on par with nuclear medicine physicians was possible. The developed AI tool is freely available for researchers.
Rationale: Standardized staging and quantitative reporting are necessary to demonstrate the association of 18F-DCFPyL PET/CT (PSMA) imaging with clinical outcome. This work introduces an automated platform to implement and extend the Prostate Cancer Molecular Imaging Standardized Evaluation (PROMISE) criteria: aPROMISE. The objective is to validate the performance of aPROMISE in staging and quantifying disease burden in patients with prostate cancer who undergo PSMA imaging. Methods: This was a retrospective analysis of 109 veterans with intermediate- and high-risk prostate cancer who underwent PSMA imaging. To validate the performance of aPROMISE, two independent nuclear medicine physicians conducted aPROMISE-assisted reads, resulting in standardized reports that quantify individual lesions and stage the patients. Patients were staged as having local disease only (miN0M0), regional lymph node disease only (miN1M0), distant metastatic disease only (miN0M1), or both regional and distant metastatic disease (miN1M1). The staging obtained from the aPROMISE-assisted reads was compared with the staging by conventional imaging. Cohen's pairwise kappa agreement was used to evaluate inter-reader variability. Correlation coefficients and the intraclass correlation coefficient (ICC) were used to evaluate the inter-reader variability of the quantitative assessment (miPSMA index) in each stage. Kendall's tau and t-tests were used to evaluate the association of the miPSMA index with PSA and Gleason score. Results: All PSMA images of the 109 veterans met the DICOM conformity and the requirements for the aPROMISE analysis. Both independent aPROMISE-assisted analyses demonstrated significant upstaging in patients with localized (23%; N = 20/87) and regional tumour burden (25%; N = 2/8). However, a significant number of patients with bone metastases identified on conventional imaging (NaF PET/CT) were downstaged (29%; N = 4/14).
The comparison of the two independent aPROMISE-assisted reads demonstrated high kappa agreement: 0.82 (miN0M0), 0.90 (miN1M0), and 0.77 (miN0M1). The Spearman correlations of the quantitative miPSMA index were 0.93, 0.96, and 0.97, respectively. As a continuous variable, the miPSMA index in the prostate (miT) was associated with risk groups defined by PSA and Gleason score. Conclusion: Here we demonstrate the consistency of the aPROMISE platform between readers and observed substantial upstaging in PSMA imaging compared with conventional imaging. aPROMISE may contribute to the broader standardization of PSMA imaging assessment and to its clinical utility in the management of prostate cancer patients.
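Cohen's kappa, used above for inter-reader staging agreement, corrects the observed agreement for the agreement expected by chance from each reader's marginal stage frequencies. A minimal sketch in pure Python (the stage labels follow the miN/M notation above; the reader data are invented for illustration):

```python
from collections import Counter

def cohens_kappa(reader_a: list, reader_b: list) -> float:
    """Cohen's kappa for two readers' categorical assignments."""
    assert len(reader_a) == len(reader_b)
    n = len(reader_a)
    # Observed agreement: fraction of cases where the readers assign the same stage.
    observed = sum(a == b for a, b in zip(reader_a, reader_b)) / n
    # Chance agreement from the product of the readers' marginal frequencies.
    counts_a, counts_b = Counter(reader_a), Counter(reader_b)
    expected = sum(counts_a[c] * counts_b[c]
                   for c in set(reader_a) | set(reader_b)) / (n * n)
    return (observed - expected) / (1 - expected)

a = ["miN0M0", "miN0M0", "miN1M0", "miN0M1", "miN0M0"]
b = ["miN0M0", "miN1M0", "miN1M0", "miN0M1", "miN0M0"]
print(round(cohens_kappa(a, b), 3))  # → 0.688
```

Kappa of 1.0 is perfect agreement and 0.0 is chance level, so the 0.77-0.90 values reported above indicate substantial to almost-perfect inter-reader agreement on staging.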