The benchmark method for evaluating breast cancer is microscopic examination of a hematoxylin and eosin (H&E)-stained tissue biopsy. Re-surgery is required in 20% to 30% of cases because of incomplete excision of malignant tissue, so a more accurate method of detecting the cancer margin is needed to reduce the risk of recurrence. In recent years, convolutional neural networks (CNNs) have achieved excellent performance in medical image diagnosis: they automatically extract features from images and classify them. In the proposed study, we apply a pretrained Inception-v3 CNN with reverse active learning to classify healthy and malignant breast tissue in optical coherence tomography (OCT) images. The proposed method attained a sensitivity, specificity, and accuracy of 90.2%, 91.7%, and 90%, respectively, on testing datasets collected from 48 patients (22 normal fibro-adipose tissues and 26 invasive ductal carcinoma cancerous tissues). The trained network is then used for breast cancer margin assessment, predicting tumors with negative margins. Additionally, the network output is correlated with the corresponding histology image. Our results lay the foundation for using the proposed method to perform automatic intraoperative identification of breast cancer margins in real time and to guide core needle biopsies.
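The "reverse active learning" step named in this abstract can be sketched as follows. This is a minimal stand-in under stated assumptions: a one-dimensional toy feature and a nearest-centroid classifier replace the paper's Inception-v3 features, and all names and data are illustrative, not the authors' code. The idea is the reverse of active learning: instead of querying labels for uncertain samples, training samples whose given label most disagrees with the current model are treated as likely mislabelled and dropped before retraining.

```python
def centroid(points):
    """Mean of a list of 1-D feature values."""
    return sum(points) / len(points)

def fit(samples):
    """samples: list of (feature, label) pairs with label in {0, 1}."""
    c0 = centroid([x for x, y in samples if y == 0])
    c1 = centroid([x for x, y in samples if y == 1])
    return c0, c1

def predict(model, x):
    """Nearest-centroid decision rule."""
    c0, c1 = model
    return 0 if abs(x - c0) <= abs(x - c1) else 1

def reverse_active_learning(samples, rounds=3):
    """Each round: retrain, then drop the sample whose given label most
    disagrees with the model (it sits closer to the other class)."""
    samples = list(samples)
    for _ in range(rounds):
        c0, c1 = fit(samples)

        def disagreement(s):
            # positive when the point is closer to the OTHER class centroid
            x, y = s
            own, other = (c0, c1) if y == 0 else (c1, c0)
            return abs(x - own) - abs(x - other)

        worst = max(samples, key=disagreement)
        if disagreement(worst) <= 0:   # nothing looks mislabelled; stop
            break
        samples.remove(worst)
    return fit(samples), samples

# toy data: class 0 clusters near 1.0, class 1 near 5.0, plus one
# deliberately mislabelled point (4.9 tagged as class 0)
data = [(0.9, 0), (1.1, 0), (1.0, 0), (4.9, 0),
        (5.1, 1), (4.8, 1), (5.2, 1)]
model, cleaned = reverse_active_learning(data)
print(len(cleaned))   # → 6: the mislabelled point was removed
```

After cleaning, the retrained centroids sit at the true cluster centers, so the previously mislabelled point is correctly assigned to class 1.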
We report the first fully automated detection of human skin burn injuries in vivo, with the goal of automatic surgical margin assessment based on optical coherence tomography (OCT) images. Our proposed automated procedure entails building a machine-learning-based classifier by extracting quantitative features from normal and burn tissue images recorded by OCT. In this study, 56 samples (28 normal, 28 burned) were imaged by OCT and eight features were extracted. A linear model classifier was trained using 34 samples and 22 samples were used to test the model. Sensitivity of 91.6% and specificity of 90% were obtained. Our results demonstrate the capability of a computer-aided technique for accurately and automatically identifying burn tissue resection margins during surgical treatment.
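The pipeline this abstract describes — quantitative features extracted from OCT images, a linear model trained on one subset and evaluated for sensitivity and specificity on a held-out subset — can be sketched as below. The two features (think mean intensity and attenuation slope) and the closed-form linear rule (weights joining the two class centroids, boundary at their midpoint) are hypothetical stand-ins, not the study's actual eight features or classifier.

```python
def fit_linear(X, y):
    """Closed-form linear model: weight vector joins the two class
    centroids; the bias places the decision boundary at their midpoint."""
    c0 = [sum(col) / len(col) for col in zip(*[x for x, lab in zip(X, y) if lab == 0])]
    c1 = [sum(col) / len(col) for col in zip(*[x for x, lab in zip(X, y) if lab == 1])]
    w = [a - b for a, b in zip(c1, c0)]
    mid = [(a + b) / 2 for a, b in zip(c1, c0)]
    b = -sum(wi * mi for wi, mi in zip(w, mid))
    return w, b

def classify(model, xi):
    w, b = model
    return 1 if sum(wj * xj for wj, xj in zip(w, xi)) + b > 0 else 0

def sensitivity_specificity(model, X, y):
    """Sensitivity = TP / P (burn detected); specificity = TN / N."""
    tp = sum(1 for xi, yi in zip(X, y) if yi == 1 and classify(model, xi) == 1)
    tn = sum(1 for xi, yi in zip(X, y) if yi == 0 and classify(model, xi) == 0)
    p = sum(y)
    n = len(y) - p
    return tp / p, tn / n

# illustrative features: normal tissue (label 0) near (1, 1),
# burned tissue (label 1) near (4, 4)
X_train = [[1.0, 1.0], [1.2, 0.8], [0.9, 1.1], [4.0, 4.0], [3.8, 4.2], [4.1, 3.9]]
y_train = [0, 0, 0, 1, 1, 1]
X_test = [[1.1, 1.0], [0.8, 1.2], [4.0, 4.1], [3.9, 3.8]]
y_test = [0, 0, 1, 1]

model = fit_linear(X_train, y_train)
sens, spec = sensitivity_specificity(model, X_test, y_test)
print(sens, spec)   # → 1.0 1.0 on these cleanly separated toy clusters
```

On the real 56-sample OCT dataset the classes overlap, which is why the reported sensitivity and specificity land near 90% rather than at the 100% this separable toy example yields.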
Breast cancer is the second leading cause of death in women; even after surgery, 20%–30% of patients require re-surgery for the removal of incompletely excised malignant tissue. To reduce the risk of recurrence, exact navigation during surgery is essential. In this paper, a multi-level ensemble model is developed to classify normal and cancerous breast tissue using features extracted from optical coherence tomography (OCT) images. Both A-line and B-scan features were extracted, and the training-data feature space was searched with a best-first search algorithm to select the best feature subset for each base classifier. To improve the performance of the model, a reverse active learning process is used to remove mislabelled images. On testing data collected from 24 patients (11 normal fibro-adipose tissues and 13 invasive ductal carcinoma cancerous tissues), the ensemble classifier attained an average sensitivity, specificity, accuracy, and Matthews correlation coefficient of 0.954, 0.93, 0.9435, and 0.887, respectively. Compared with the individual classifiers, the multi-level ensemble model based on the best-first search algorithm achieved higher performance across these metrics. This study demonstrates that automated tissue characterization based on OCT measurements of cancer tissue can become a biomarker for the complete excision of malignant tissue, and it requires no special tissue preparation.
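The best-first feature-subset search this abstract applies per base classifier can be sketched as follows. It is a minimal sketch under stated assumptions: a nearest-centroid base classifier scores candidate subsets by held-out accuracy, and the three synthetic features (feature 0 informative, features 1 and 2 noise) are illustrative, not the paper's A-line/B-scan features.

```python
import heapq

def evaluate(subset, train, test):
    """Held-out accuracy of a nearest-centroid classifier restricted
    to the feature indices in `subset`."""
    def proj(x):
        return [x[i] for i in subset]
    cents = {}
    for label in (0, 1):
        pts = [proj(x) for x, y in train if y == label]
        cents[label] = [sum(col) / len(col) for col in zip(*pts)]
    def pred(x):
        px = proj(x)
        d = {l: sum((a - b) ** 2 for a, b in zip(px, c)) for l, c in cents.items()}
        return min(d, key=d.get)
    return sum(pred(x) == y for x, y in test) / len(test)

def best_first_select(n_features, train, test, max_expansions=20):
    """Best-first search over feature subsets: always expand the
    highest-scoring subset seen so far, adding one feature at a time."""
    heap = [(-0.0, ())]                 # (negated score, subset) — max-first
    best_score, best_subset = -1.0, ()
    seen = {()}
    for _ in range(max_expansions):
        if not heap:
            break
        _, subset = heapq.heappop(heap)
        for f in range(n_features):
            if f in subset:
                continue
            cand = tuple(sorted(subset + (f,)))
            if cand in seen:
                continue
            seen.add(cand)
            score = evaluate(cand, train, test)
            if score > best_score:
                best_score, best_subset = score, cand
            heapq.heappush(heap, (-score, cand))
    return best_subset, best_score

# feature 0 separates the classes; features 1 and 2 are noise
train = [([1.0, 5.0, 2.0], 0), ([1.2, 1.0, 3.0], 0), ([0.8, 4.0, 2.5], 0),
         ([4.0, 5.0, 2.4], 1), ([4.2, 1.5, 2.0], 1), ([3.8, 4.5, 3.1], 1)]
test = [([1.1, 3.0, 2.6], 0), ([0.9, 2.0, 2.2], 0),
        ([4.1, 3.0, 2.5], 1), ([3.9, 2.5, 2.8], 1)]

best_subset, best_score = best_first_select(3, train, test)
print(best_subset, best_score)   # → (0,) 1.0: the informative feature wins
```

Because the search keeps a frontier of scored subsets rather than committing greedily, it can back out of a noisy feature added early — the property that motivates best-first search over plain forward selection.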
Malaria is a life-threatening infectious blood disease of humans and other animals, caused by parasitic protozoans of the genus Plasmodium, and is especially prevalent in developing countries. The gold-standard method for detecting malaria is microscopic examination of chemically treated blood smears. We developed an automated optical spatial coherence tomographic system with a machine learning approach for fast identification of malaria-infected cells. In this study, 28 samples (15 healthy, 13 at malaria-infected stages of red blood cells) were imaged by the developed system and 13 features were extracted. We designed a multilevel ensemble-based classifier for the quantitative prediction of the different stages of malaria-infected cells. The proposed classifier was evaluated with repeated k-fold cross-validation and achieved a high average accuracy of 97.9% in identifying the malaria-infected late trophozoite stage. Overall, our proposed system and multilevel ensemble model have substantial, quantifiable potential to detect the different stages of malaria infection without staining or expert supervision.
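The repeated k-fold cross-validation used to evaluate the classifier above can be sketched as follows. As a hedged illustration, a nearest-centroid classifier stands in for the paper's multilevel ensemble, and the two-feature toy data are invented stand-ins for the 13 extracted features; the function names are my own.

```python
import random

def centroid_fit(train):
    """train: list of (features, label); returns per-class centroids."""
    cents = {}
    for lab in sorted(set(y for _, y in train)):
        pts = [x for x, y in train if y == lab]
        cents[lab] = [sum(col) / len(col) for col in zip(*pts)]
    return cents

def centroid_predict(cents, x):
    d = {lab: sum((a - b) ** 2 for a, b in zip(x, c)) for lab, c in cents.items()}
    return min(d, key=d.get)

def repeated_kfold_accuracy(data, k=4, repeats=5, seed=0):
    """Average test accuracy over `repeats` freshly shuffled k-fold splits:
    every sample serves as test data once per repeat."""
    rng = random.Random(seed)
    accs = []
    for _ in range(repeats):
        idx = list(range(len(data)))
        rng.shuffle(idx)
        folds = [idx[i::k] for i in range(k)]
        for fold in folds:
            test = [data[i] for i in fold]
            train = [data[i] for i in idx if i not in fold]
            model = centroid_fit(train)
            accs.append(sum(centroid_predict(model, x) == y
                            for x, y in test) / len(test))
    return sum(accs) / len(accs)

# toy cell features: healthy (0) and infected (1) classes, well separated
data = [([1.0, 1.0], 0), ([1.2, 0.9], 0), ([0.8, 1.1], 0), ([1.1, 1.2], 0),
        ([5.0, 5.0], 1), ([5.2, 4.9], 1), ([4.8, 5.1], 1), ([5.1, 5.2], 1)]
acc = repeated_kfold_accuracy(data, k=4, repeats=5)
print(acc)   # → 1.0 on these cleanly separated clusters
```

Repeating the k-fold split with fresh shuffles, as the abstract's evaluation does, averages out the luck of any single partition and gives a more stable accuracy estimate on small datasets like the 28-sample one here.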