The purpose of this work is to identify the pectoral muscle region in mediolateral oblique (MLO) view mammograms even when the boundary is blurred or obscured. We decompose the problem into two subproblems: identifying boundary segments with high confidence, and predicting the overall shape of the pectoral muscle. For the first subproblem, because pectoral muscle and glandular tissue are similar in intensity and texture, we train a deep neural network to distinguish them; high-confidence boundary segments are then obtained from the consistency of predictions made by multiple independently converged models. For the shape-prediction subproblem, a generative adversarial network (GAN) learns a mapping from the identified region and the breast shape to the overall pectoral muscle shape. Our method is evaluated on a dataset of 633 MLO view mammograms collected from three different centers. We take U-Net as the baseline model and divide the dataset into three groups according to U-Net's performance. Across the three groups, U-Net achieves Dice similarity coefficients of 80.1%, 92.9%, and 98.3%, respectively, while our method achieves 85.2%, 94.8%, and 98.1%. The experiments show that our method effectively estimates the pectoral muscle boundary, including segments that are difficult to detect, and substantially improves segmentation performance in those cases.
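The abstract does not include implementation details, but the consistency step could look like the following minimal PyTorch sketch, in which several independently trained segmentation models vote per pixel and only unanimously agreed pixels are kept as the high-confidence region. The function name, the list of models, and the logit-plus-sigmoid output convention are illustrative assumptions, not the authors' code.

```python
import torch

def high_confidence_mask(models, image, threshold=0.5):
    """Keep only pixels on which all converged models agree.

    models : list of trained segmentation networks (hypothetical)
    image  : (1, 1, H, W) MLO-view mammogram tensor
    Returns an (H, W) bool mask of high-confidence pectoral-muscle pixels.
    """
    with torch.no_grad():
        # Binarize each model's probability map independently.
        preds = [torch.sigmoid(m(image)) > threshold for m in models]
    votes = torch.stack(preds, dim=0)   # (K, 1, 1, H, W) binary votes
    # A pixel counts as high confidence only when every model marks it
    # as pectoral muscle; disagreements are left for the GAN-based
    # shape-prediction stage.
    return votes.all(dim=0).squeeze()
```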
Purpose: The Breast Imaging-Reporting and Data System (BI-RADS) for ultrasound imaging provides a widely used reporting schema for breast imaging. Previous studies have shown that, in ultrasound imaging, 90% of BI-RADS 4A tumors prove to be benign after biopsy, so accurate classification of BI-RADS 4A tumors could avoid many unnecessary biopsy procedures. The task is challenging, however, and has not been fully investigated: within each class (benign or malignant), BI-RADS 4A tumor appearances are highly variable, while across classes they are broadly similar. Discriminative features are therefore needed to improve classification accuracy. Methods: We design a network that exploits the clinical characteristics of BI-RADS 4A tumors to improve its discriminative ability. Boundary information is embedded into the network input as an uncertainty map, and a fine-grained data augmentation method is used to discover discriminative features in the boundary-augmented tumor images. Two definitions of boundary uncertainty, voting-based and variance-based, are compared within the same classification network. Results: The evaluation dataset contained 1155 2D grayscale images, each representing a unique BI-RADS 4A tumor; 248 tumors were proven malignant by biopsy and the remaining 907 were benign. A weakly supervised data augmentation network (WS-DAN) served as the backbone classifier and showed competitive performance in finding discriminative features. With voting-based uncertain boundaries as auxiliary input, our method achieved an area under the curve (AUC) of 0.8347 (sensitivity = 0.7774, specificity = 0.7459), exceeding both the variance-based variant (AUC = 0.7789) and the baseline that uses only the original image (AUC = 0.803). Compared with classic classification networks, the improvement was significant (p < 0.01). Conclusions: Using voting-based uncertain boundaries as auxiliary information improved the classification of BI-RADS 4A ultrasound images, whereas variance-based uncertain boundaries did not improve performance. In addition, the fine-grained network found discriminative features more effectively than commonly used classification networks.
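As a rough illustration of the two uncertainty definitions being compared, the sketch below derives a voting-based and a variance-based per-pixel uncertainty map from K hypothetical boundary-probability maps; the abstract does not give the exact formulas, so these are plausible stand-ins rather than the authors' definitions.

```python
import torch

def voting_uncertainty(prob_maps, threshold=0.5):
    """Voting-based uncertainty from (K, H, W) probability maps.

    Each of K models casts a binary vote per pixel; uncertainty is 0
    where the vote is unanimous and 1 where it splits 50/50.
    """
    votes = (prob_maps > threshold).float()
    p = votes.mean(dim=0)               # fraction voting "boundary"
    return 1.0 - 2.0 * (p - 0.5).abs()

def variance_uncertainty(prob_maps):
    """Variance-based alternative: per-pixel variance of the K maps."""
    return prob_maps.var(dim=0)

# Either map can be stacked with the grayscale image as a second input
# channel, e.g. x = torch.stack([image, voting_uncertainty(probs)], dim=0)
```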
Microvascular invasion (MVI) is a reliable predictor of survival in patients with hepatocellular carcinoma (HCC). Accurate preoperative MVI assessment is essential to determine the appropriate surgical approach and management strategy and to decrease the HCC recurrence rate. In this study, a preoperative evaluation method was proposed based on a convolutional neural network (CNN): a multi-modal, multi-response CNN that explores the relationship between computed tomography (CT) volume data and MVI in an end-to-end fashion. A total of 400 patients were included. The arterial phase (AP) and venous phase (VP) volumes serve as the model inputs; inputs of arbitrary size are converted to a fixed size by a spatial pyramid pooling (SPP) layer placed after the convolutional layers. The AP and VP features are then combined through multi-modal fusion at the decision-making layers. Of the 400 patients, 215 (53.75%) were MVI-positive and 185 (46.25%) were MVI-negative. The areas under the receiver operating characteristic curves of the three-dimensional (3D) CNN model on the training and testing sets were 0.904 and 0.893, respectively. In the test set, 88.89% of the MVI-negative cases (16/18) and 86.36% of the MVI-positive cases (19/22) were detected. These results indicate a considerable feature correlation between CT volume data and MVI, and the proposed multi-modal, multi-response CNN model had a positive effect on the preoperative evaluation of MVI.
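The SPP step is what allows the network to accept CT volumes of arbitrary size; a minimal 3D version built on PyTorch's adaptive pooling might look like the sketch below. The class name SPP3D and the pyramid levels (1, 2, 4) are assumptions, since the abstract does not specify them.

```python
import torch
import torch.nn as nn

class SPP3D(nn.Module):
    """3D spatial pyramid pooling: maps a feature volume of arbitrary
    spatial size to a fixed-length vector (pyramid levels assumed)."""

    def __init__(self, levels=(1, 2, 4)):
        super().__init__()
        self.pools = nn.ModuleList(nn.AdaptiveMaxPool3d(l) for l in levels)

    def forward(self, x):  # x: (N, C, D, H, W) with any D, H, W
        return torch.cat([p(x).flatten(start_dim=1) for p in self.pools], dim=1)

# AP and VP feature volumes of different sizes map to the same fixed
# length, so their descriptors can be fused at the decision layers.
spp = SPP3D()
ap = spp(torch.randn(1, 32, 10, 24, 24))  # (1, 32 * (1 + 8 + 64)) = (1, 2336)
vp = spp(torch.randn(1, 32, 14, 30, 30))  # same fixed output length
```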