“…$z^{(t)}_{nk}$, (A.7) and the weight coefficient $\eta_{k\tau}$ is given by $$\eta^{(t+1)}_{k\tau} = \frac{\sum_{n=1}^{N} z^{(t)}_{nk}\, y^{(t)}_{nk\tau}}{\sum_{n=1}^{N} z^{(t)}_{nk} \sum_{m=1}^{2} y^{(t)}_{nkm}}.$$…”
Section: Discussion
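The weight-coefficient update quoted above (Eq. A.7) is a standard EM M-step ratio of responsibilities. A minimal numerical sketch, assuming `z` holds the per-component responsibilities and `y` the inner generalized-Gaussian/Cauchy responsibilities (the array names and shapes are our assumption, not from the paper):

```python
import numpy as np

def update_weights(z, y):
    """One M-step update for the mixing weights eta_{k,tau} (Eq. A.7).

    z : (N, K) responsibilities of sample n for component k
    y : (N, K, 2) inner responsibilities of the generalized-Gaussian
        (m=1) and Cauchy (m=2) parts within component k
    """
    # numerator: sum_n z_nk * y_nk,tau  -> shape (K, 2)
    num = np.einsum('nk,nkt->kt', z, y)
    # denominator: sum_n z_nk * sum_m y_nkm  -> shape (K, 1)
    den = (z[:, :, None] * y.sum(axis=2, keepdims=True)).sum(axis=0)
    return num / den
```

By construction each row of the returned array sums to one, as mixing weights must.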
“…In the last few decades, many methods have been proposed to solve the overlapping cervical cell segmentation problem [7,8]. For example, Ushizima et al. [9] proposed an unsupervised method that combines superpixels with a Voronoi technique to detect cervical cells.…”
Accurate segmentation of nuclei and cytoplasm in Pap smear images is challenging in cervical cytological analysis. In this paper, a new fusion algorithm is put forward that combines the asymmetric generalized Gaussian and Cauchy mixture model (GGCMM) with a shape-constrained level set method to segment overlapping cervical smear cells. The proposed approach starts by separating the nucleus and cytoplasm clusters through the asymmetric GGCMM, where each component is a mixture of a generalized Gaussian distribution and a Cauchy distribution. The asymmetric GGCMM exploits both the asymmetry of the generalized Gaussian distribution and the heavier tail of the Cauchy distribution, so the new probability distribution fits different shapes of observed data more flexibly. Then, a morphological operation is applied to remove fake nuclei, which are usually much smaller than real nuclei. After that, an improved level set energy function with a distance map and a new shape prior term is applied to extract the contours of overlapping cervical cells. Owing to this new energy function, every individual cell is segmented well, especially in overlapping areas. We evaluate the proposed method on the ISBI 2014 Challenge Dataset. The results demonstrate that our approach outperforms existing methods in extracting overlapping cervical cells and obtains accurate cell contours.
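The fake-nucleus removal step described in the abstract can be sketched as a connected-component size filter; the `min_area` threshold below is illustrative, since the paper's actual value and structuring element are not given here:

```python
import numpy as np
from scipy import ndimage

def remove_small_nuclei(mask, min_area=50):
    """Drop candidate nuclei smaller than min_area pixels.

    mask : boolean array from the nucleus-class segmentation
    min_area : hypothetical size threshold (not from the paper)
    """
    labels, n = ndimage.label(mask)          # connected components
    if n == 0:
        return mask
    # area of each labeled component (labels 1..n)
    areas = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    keep = np.flatnonzero(areas >= min_area) + 1
    return np.isin(labels, keep)
```

Real nuclei (large components) survive the filter; isolated small blobs are discarded as fake nuclei.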
“…In [43], regions of interest for cell nuclei segmentation are first obtained by applying the Mean-Shift clustering algorithm, and mathematical morphology is then applied to split overlapping cell nuclei for better accuracy and robustness.…”
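The Mean-Shift step cited from [43] can be illustrated with a toy one-dimensional mean-shift over pixel intensities (a flat-kernel simplification written for clarity; the cited work's actual feature space and kernel may differ):

```python
import numpy as np

def mean_shift_1d(values, bandwidth=0.1, iters=30):
    """Toy 1-D mean-shift clustering with a flat kernel.

    values : 1-D array of pixel intensities in [0, 1]
    Returns an integer cluster label per value.
    """
    modes = values.astype(float).copy()
    for _ in range(iters):
        # shift each point to the mean of the values within its bandwidth
        dist = np.abs(modes[:, None] - values[None, :])
        w = dist <= bandwidth
        modes = (w * values[None, :]).sum(axis=1) / w.sum(axis=1)
    # group converged modes into discrete clusters
    bins = np.round(modes / (bandwidth / 2)).astype(int)
    _, labels = np.unique(bins, return_inverse=True)
    return labels
```

Points converge toward local density modes, so dark nucleus pixels and bright background pixels fall into separate clusters.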
Cervical cytology screening using Pap smear or liquid-based cytology is one of the most widely followed and accepted methods. Automation-assisted screening based on cervical cytology has become a necessity because manual screening, in which a cervical cell specimen on a glass slide is visually analyzed under the microscope, is usually labor-intensive and time-consuming. While automation-assisted reading systems can improve efficiency, their performance often relies on accurate cell segmentation and hand-crafted feature extraction. This paper presents an efficient and entirely segmentation-free method for automated cervical cell screening that uses a modern object detector to directly detect cervical cells or clumps, without designing specific hand-crafted features. Specifically, we use a state-of-the-art CNN-based object detection method, YOLOv3, as our baseline model. To improve the classification performance on hard examples, namely four highly similar categories, we cascade an additional task-specific classifier. We also investigate the presence of unreliable annotations and cope with them by smoothing the distribution of noisy labels. We comprehensively evaluate our methods on our test set, which consists of 1,014 annotated cervical cell images of size 4000×3000 with complex cellular situations corresponding to 10 categories. Our model achieves 97.5% sensitivity (Sens) and 67.8% specificity (Spec) on cervical cell image-level screening. Moreover, we obtain a best mean Average Precision (mAP) of 63.4% on cervical cell-level diagnosis and improve the Average Precision (AP) on hard examples, which are the most valuable but most difficult to distinguish. Our automation-assisted cervical cell reading system not only achieves cervical cell image-level classification but also provides more detailed location and category reference information for abnormal cells.
The results indicate the feasible performance of our method, together with its efficiency and robustness, providing a new idea for the future development of computer-assisted reading systems in clinical cervical screening.
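The noisy-label smoothing mentioned in this abstract can be sketched with classic label smoothing, i.e., mixing the one-hot target with a uniform distribution; `eps=0.1` is a conventional default, and the paper's exact scheme for unreliable annotations may differ:

```python
import numpy as np

def smooth_labels(onehot, eps=0.1):
    """Classic label smoothing over the last axis.

    onehot : one-hot target vector(s), shape (..., num_classes)
    eps    : smoothing strength (illustrative default, not from the paper)
    """
    k = onehot.shape[-1]
    # keep (1 - eps) mass on the annotated class, spread eps uniformly
    return onehot * (1.0 - eps) + eps / k
```

The smoothed target remains a valid probability distribution, which makes the loss less sensitive to individual mislabeled examples.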
“…A series of nucleus and cytoplasm features were extracted and used for training, including shape-based features, statistical features, and so on [7]-[11]. The features were selected by chain-like agent genetic (CAGA) [12], apriori-like feature selection [13], or clustering [14], and used with different classifiers. Accurate cell segmentation is crucial to the performance of these systems.…”
Automatic classification of cervical Pap smear images plays a key role in computer-aided cervical cancer diagnosis. Conventional classification approaches rely on cell segmentation and feature extraction methods. Due to overlapping cells, dust, impurities, and uneven irradiation, accurate segmentation and feature extraction of Pap smear images are still challenging. To overcome the difficulties of feature-based approaches, deep learning is becoming a more important alternative. Since the number of cervical cytological images is limited, an adaptive pruning deep transfer learning model (PsiNet-TAP) is proposed for Pap smear image classification. We designed a novel network to classify Pap smear images. Because of the limited number of images, we adopted transfer learning to obtain the pre-trained model, which was then optimized by modifying the convolution layers and pruning convolution kernels that may interfere with the target classification task. The proposed PsiNet-TAP method was tested on 389 cervical Pap smear images and achieved remarkable performance (accuracy above 98%), which demonstrates its strength as an efficient tool for cervical cancer classification in clinical settings.
Index Terms: adaptive pruning, cervical smear images, convolutional neural networks, transfer learning, uninvolved images.
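The kernel-pruning idea in the PsiNet-TAP abstract can be illustrated with a common magnitude-based proxy: rank convolution kernels by their L1 norm and zero out the weakest ones. This is a generic sketch; the paper's actual adaptive, task-aware pruning criterion is not specified here and may differ:

```python
import numpy as np

def prune_kernels(weights, keep_ratio=0.7):
    """Zero out the conv kernels with the smallest L1 norms.

    weights    : (out_channels, in_channels, kh, kw) conv weight tensor
    keep_ratio : fraction of kernels to keep (illustrative value)
    """
    # one L1 norm per output filter
    norms = np.abs(weights).reshape(weights.shape[0], -1).sum(axis=1)
    n_keep = max(1, int(round(keep_ratio * weights.shape[0])))
    keep = np.argsort(norms)[-n_keep:]       # indices of strongest filters
    pruned = np.zeros_like(weights)
    pruned[keep] = weights[keep]
    return pruned
```

In practice the pruned filters would be physically removed (shrinking the next layer's input channels) rather than merely zeroed, but zeroing keeps the sketch short.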
scite is a Brooklyn-based organization that helps researchers better discover and understand research articles through Smart Citations–citations that display the context of the citation and describe whether the article provides supporting or contrasting evidence. scite is used by students and researchers from around the world and is funded in part by the National Science Foundation and the National Institute on Drug Abuse of the National Institutes of Health.