Background: In this study, images of 2450 benign and 2557 malignant thyroid nodules were collected and labeled, and an automatic image recognition and diagnosis system was established by deep learning using the YOLOv2 neural network. The performance of the system in the diagnosis of thyroid nodules was evaluated, and the value of artificial intelligence in clinical practice was investigated.
Methods: Ultrasound images of 276 patients were retrospectively selected. The radiologists made their diagnoses according to the Thyroid Imaging Reporting and Data System; the same images were automatically recognized and diagnosed by the established artificial intelligence system. Pathological diagnosis served as the gold standard for the final diagnosis. The performance of the established system and of the radiologists in diagnosing benign and malignant thyroid nodules was compared.
Results: The artificial intelligence diagnosis system correctly identified the lesion areas, with an area under the receiver operating characteristic (ROC) curve of 0.902, higher than that of the radiologists (0.859), indicating better diagnostic accuracy (p = 0.0434). The sensitivity, positive predictive value, negative predictive value, and accuracy of the artificial intelligence diagnosis system for malignant thyroid nodules were 90.5%, 95.22%, 80.99%, and 90.31%, respectively, and did not differ significantly from those of the radiologists (p > 0.05). The artificial intelligence diagnosis system had higher specificity (89.91% vs. 77.98%, p = 0.026).
Conclusions: Compared with experienced radiologists, the artificial intelligence system has comparable sensitivity and accuracy for the diagnosis of malignant thyroid nodules and better diagnostic ability for benign thyroid nodules. As an auxiliary tool, this artificial intelligence diagnosis system can provide radiologists with substantial assistance in the diagnosis of benign and malignant thyroid nodules.
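The statistics quoted above (sensitivity, specificity, PPV, NPV, accuracy, and ROC AUC) all derive from a 2×2 confusion matrix against the pathological gold standard. A minimal NumPy sketch of how such figures are computed (the function name and the 1 = malignant label encoding are our own illustration, not taken from the study):

```python
import numpy as np

def diagnostic_metrics(y_true, y_pred, y_score):
    """Summary statistics of a binary diagnostic test.

    y_true  : 1 = malignant (pathology, gold standard), 0 = benign
    y_pred  : binary calls from the system or the radiologist
    y_score : continuous confidence, used for the ROC AUC
    """
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    y_score = np.asarray(y_score, dtype=float)

    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))

    # AUC: probability that a random malignant case outscores a random
    # benign one (Mann-Whitney form), counting ties as one half
    pos, neg = y_score[y_true == 1], y_score[y_true == 0]
    diff = pos[:, None] - neg[None, :]
    auc = ((diff > 0).sum() + 0.5 * (diff == 0).sum()) / diff.size

    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / y_true.size,
        "auc": float(auc),
    }
```

The study's reported specificity gap (89.91% vs. 77.98%) is exactly this `tn / (tn + fp)` ratio computed for the system and the radiologists on the same 276 patients.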
MRI is the gold standard for confirming a pelvic lymph node metastasis diagnosis. Traditionally, medical radiologists have analyzed the MRI image features of regional lymph nodes and made diagnostic decisions based on their subjective experience; such diagnoses lack objectivity and accuracy. This study trained a faster region-based convolutional neural network (Faster R-CNN) on 28,080 MRI images of lymph node metastasis, allowing the Faster R-CNN to read those images and make diagnoses. For clinical verification, 414 cases of rectal cancer from various medical centers were collected, and Faster R-CNN-based diagnoses were compared with radiologist diagnoses using receiver operating characteristic (ROC) curves. The area under the Faster R-CNN ROC curve was 0.912, indicating a more effective and objective diagnosis. The Faster R-CNN diagnosis time was 20 s/case, much shorter than the average time of the radiologist diagnoses (600 s/case). Faster R-CNN enables accurate and efficient diagnosis of lymph node metastases.
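The comparison above rests on the areas under the two ROC curves for the same 414 cases. The abstract does not say how the AUC difference was tested; one common approach for paired raters is a case-level bootstrap, sketched here with NumPy (the function names, the 95% percentile interval, and the resampling scheme are illustrative assumptions, not the authors' stated method):

```python
import numpy as np

def auc(y, s):
    """Mann-Whitney form of the ROC AUC for binary labels y and scores s."""
    pos, neg = s[y == 1], s[y == 0]
    diff = pos[:, None] - neg[None, :]
    return ((diff > 0).sum() + 0.5 * (diff == 0).sum()) / diff.size

def bootstrap_auc_diff(y, score_a, score_b, n_boot=2000, seed=0):
    """95% percentile interval for AUC(a) - AUC(b).

    Cases are resampled with replacement, and both raters' scores are
    indexed by the same resample so the comparison stays paired.
    """
    rng = np.random.default_rng(seed)
    y = np.asarray(y)
    a = np.asarray(score_a, dtype=float)
    b = np.asarray(score_b, dtype=float)
    diffs = []
    while len(diffs) < n_boot:
        idx = rng.integers(0, y.size, y.size)
        yb = y[idx]
        if yb.min() == yb.max():      # resample must contain both classes
            continue
        diffs.append(auc(yb, a[idx]) - auc(yb, b[idx]))
    lo, hi = np.percentile(diffs, [2.5, 97.5])
    return float(lo), float(hi)
```

An interval that excludes zero would support the claim that the network's AUC of 0.912 reflects a genuine advantage rather than sampling noise.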
Background: Artificial intelligence-assisted image recognition technology is currently able to detect the target area of an image and extract information to make classifications according to target features. This study aimed to use deep neural networks for the computed tomography (CT) diagnosis of perigastric metastatic lymph nodes (PGMLNs), simulating the recognition of lymph nodes by radiologists, and to obtain more accurate identification results. Methods: A total of 1371 images of suspected lymph node metastasis from enhanced abdominal CT scans were identified and labeled by radiologists and were used, together with 18,780 original images, for faster region-based convolutional neural network (FR-CNN) deep learning. The identification results of the FR-CNN on 6000 random CT images from 100 gastric cancer patients were compared with those of radiologists in terms of identification accuracy. Similarly, 1004 CT images with metastatic lymph nodes that had been post-operatively confirmed by pathological examination and 11,340 original images were used in the identification and learning processes described above. The same 6000 gastric cancer CT images were used for verification, and the diagnostic results were analyzed accordingly. Results: In the initial group, precision-recall curves were generated from the precision and recall rates of the nodule classes in the training and validation sets; the mean average precision (mAP) value was 0.5019. To verify the results of the initial learning group, the receiver operating characteristic curve was generated, and the corresponding area under the curve (AUC) value was calculated as 0.8995. After the second phase of precise learning, all the indicators improved, and the mAP and AUC values were 0.7801 and 0.9541, respectively. Conclusion: Through deep learning, FR-CNN achieved high judgment effectiveness and recognition accuracy in the CT diagnosis of PGMLNs.
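The mAP values reported for the FR-CNN are means, over classes, of average precision (AP): the area under the precision-recall curve of score-ranked detections. A minimal per-class AP sketch, under the simplifying assumptions that each detection has already been matched to ground truth (1 = true positive), that scores are untied, and that every ground-truth node appears among the detections (otherwise the denominator should be the total ground-truth count):

```python
import numpy as np

def average_precision(matched, scores):
    """AP for one class.

    matched : 1 if the detection hit a ground-truth lymph node, else 0
    scores  : the detector's confidence for each detection
    """
    order = np.argsort(-np.asarray(scores, dtype=float))
    hits = np.asarray(matched)[order]          # outcomes in rank order
    tp = np.cumsum(hits)
    precision = tp / np.arange(1, hits.size + 1)
    # average the precision at the ranks where true positives occur
    return float((precision * hits).sum() / hits.sum())
```

Under this definition, the jump from mAP 0.5019 to 0.7801 between the two learning phases means the refined network ranks true lymph node detections much higher relative to false alarms.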
Trial Registration: Chinese Clinical Trial Registry, No. ChiCTR1800016787; http://www.chictr.org.cn/showproj.aspx?proj=28515.
Highlights
- Clonal evolution of colorectal tumors followed a Darwinian pattern of evolution
- Intratumor heterogeneity in patients with right-sided and left-sided colon and rectal cancer
- Evolution of left-sided colon cancer and rectal cancer was more complex and divergent
- Lymph node metastases and extranodal tumor deposits were polyclonal in origin