Background and study aims: Computer-aided diagnostic tools based on deep neural networks are efficient for the detection of lesions in endoscopy but require very large numbers of images. The impact of annotation quality has not yet been evaluated. Here we describe a multi-expert annotated dataset of images extracted from capsule recordings of Crohn’s disease patients and the impact of annotation quality on the accuracy of a recurrent attention neural network.
Methods: Capsule images were first annotated by one reader and then reviewed by three experts in inflammatory bowel disease. Concordance between experts was assessed with Fleiss’ kappa, and all discordant images were re-read by all the endoscopists to reach a consensus annotation. A recurrent attention neural network developed for the study was tested before and after the consensus annotation. Publicly available neural networks (ResNet and VGGNet) were also tested under the same conditions.
Results: The final dataset included 3498 images: 2124 non-pathological (60.7 %), 1360 pathological (38.9 %), and 14 inconclusive (0.4 %). Expert agreement in distinguishing pathological from non-pathological images was good, with a kappa of 0.79 (P < 0.0001). The accuracy of our classifier and of the available neural networks increased after the consensus annotation, reaching a precision of 93.7 %, a sensitivity of 93 %, and a specificity of 95 %.
Conclusions: The accuracy of the neural network increased with improved annotations, suggesting that the number of images needed to develop such systems could be reduced with a well-designed dataset.
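Fleiss’ kappa, used above to quantify inter-expert agreement, can be computed from a subject-by-category table of rating counts. The following is a minimal pure-Python sketch; the function name and input layout are illustrative assumptions, not taken from the study’s code:

```python
def fleiss_kappa(ratings):
    """Fleiss' kappa for a table where ratings[i][j] is the number of
    raters who assigned subject (image) i to category j.
    Assumes the same number of raters for every subject."""
    n_subjects = len(ratings)
    n_raters = sum(ratings[0])  # assumed constant across subjects
    # Per-subject observed agreement P_i
    p_i = [(sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
           for row in ratings]
    p_bar = sum(p_i) / n_subjects
    # Chance agreement P_e from marginal category proportions
    n_categories = len(ratings[0])
    p_j = [sum(row[j] for row in ratings) / (n_subjects * n_raters)
           for j in range(n_categories)]
    p_e = sum(p * p for p in p_j)
    return (p_bar - p_e) / (1 - p_e)
```

With three raters unanimously agreeing on every image the function returns 1.0 (perfect agreement); values near 0.79, as reported above, indicate good but imperfect concordance.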
Wireless capsule endoscopy (WCE) allows medical doctors to examine the interior of the small intestine through a noninvasive procedure. This is particularly important for Crohn's disease (CD), where an early diagnosis improves treatment outcomes. Counting and identifying CD lesions in WCE videos is a time-consuming process for medical experts. In the deep-learning era, many automatic WCE lesion classifiers, all requiring annotated data, have been developed. However, benchmarking these classifiers is difficult owing to the lack of standard evaluation data: most detection algorithms are evaluated on private datasets or on unspecified subsets of public databases. To support the development and comparison of automatic CD lesion classifiers, we release CrohnIPI, a dataset of 3498 images independently reviewed by several experts. It contains 60.55% non-pathological images and 38.85% pathological images covering 7 different types of CD lesions; some of these images carry multiple labels. The dataset is balanced between pathological and non-pathological images and split into two subsets for training and testing models. This database will be progressively enriched over the coming years, with the aim of driving automatic detection algorithms toward the most accurate system possible and of consolidating their evaluation.
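A class-balanced train/test split of the kind described can be sketched with a simple stratified shuffle. This is illustrative only: the published CrohnIPI subsets are fixed rather than re-drawn, and the function name and parameters below are assumptions:

```python
import random

def stratified_split(labels, test_fraction=0.2, seed=0):
    """Split item indices into train/test sets while preserving the
    proportion of each label (e.g. 'pathological' vs 'non-pathological')
    in both sets."""
    rng = random.Random(seed)
    by_label = {}
    for idx, label in enumerate(labels):
        by_label.setdefault(label, []).append(idx)
    train, test = [], []
    for indices in by_label.values():
        rng.shuffle(indices)
        n_test = round(len(indices) * test_fraction)
        test.extend(indices[:n_test])
        train.extend(indices[n_test:])
    return sorted(train), sorted(test)
```

Stratifying per label keeps the pathological/non-pathological ratio of the full dataset intact in both subsets, which matters when the classes are imbalanced.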
Wireless capsule endoscopy (WCE) allows medical doctors to examine the interior of the small intestine through a noninvasive procedure. This is particularly important for Crohn's disease (CD), where an early diagnosis improves treatment outcomes. However, viewing and evaluating WCE videos is a time-consuming process for medical experts. In this work, we present a recurrent attention neural network for the detection of small-bowel CD lesions in WCE images. Our classifier reaches 90.85% accuracy on our own dataset, annotated by experts from the Hospital of Nantes. The model has also been tested on a public endoscopic dataset, the CAD-CAP database used for the GIANA competition, and achieves high performance on the detection task with an accuracy of 99.67%. This automatic lesion classifier will greatly reduce the time gastroenterologists spend reviewing WCE videos, which will likely foster the development of this technique and speed up the diagnosis of CD.
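The accuracy figures above, like the precision, sensitivity, and specificity quoted for such classifiers, all derive from the binary confusion matrix of a lesion/no-lesion decision. A minimal sketch (function and key names are illustrative, not from the paper):

```python
def binary_metrics(y_true, y_pred):
    """Accuracy, precision, sensitivity (recall), and specificity for a
    binary classification where 1 = pathological, 0 = non-pathological."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return {
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
        "precision": tp / (tp + fp),
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
    }
```

In a screening setting, sensitivity (the fraction of pathological images the model catches) is usually the metric weighed most heavily, since a missed lesion is costlier than a false alarm.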