Objective: To develop a deep convolutional neural network (DCNN) that can automatically detect laryngeal cancer (LCA) in laryngoscopic images.
Methods: A DCNN-based diagnostic system was constructed and trained using 13,721 laryngoscopic images of LCA, precancerous laryngeal lesions (PRELCA), benign laryngeal tumors (BLT) and normal tissues (NORM) from 2 tertiary hospitals in China, including 2293 images from 206 LCA subjects, 1807 from 203 PRELCA subjects, 6448 from 774 BLT subjects and 3191 from 633 NORM subjects. An independent test set of 1176 laryngoscopic images from 3 other tertiary hospitals in China, including 132 images from 44 LCA subjects, 129 from 43 PRELCA subjects, 504 from 168 BLT subjects and 411 from 137 NORM subjects, was applied to the constructed DCNN to evaluate its performance against that of experienced endoscopists.
Results: The DCNN achieved a sensitivity of 0.731, a specificity of 0.922, an AUC of 0.922, and an overall accuracy of 0.867 for detecting LCA and PRELCA among all lesions and normal tissues. When compared to human experts on the independent test set, the DCNN's performance in detecting LCA and PRELCA reached a sensitivity of 0.720, a specificity of 0.948, an AUC of 0.953, and an overall accuracy of 0.897, comparable to that of an experienced human expert with 10–20 years of work experience. Moreover, the overall accuracy of the DCNN for detecting LCA alone was 0.773, which was also comparable to that of an experienced human expert with 10–20 years of work experience and exceeded that of experts with less than 10 years of work experience.
Conclusions: The DCNN has high sensitivity and specificity for the automated detection of LCA and PRELCA against BLT and NORM in laryngoscopic images. This novel and effective approach facilitates earlier diagnosis of LCA, improving clinical outcomes and reducing the burden on endoscopists.
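To make the reported screening statistics concrete, the sketch below back-calculates approximate confusion-matrix counts for the independent test set (261 LCA/PRELCA images vs. 915 BLT/NORM images) that are consistent with the reported sensitivity (0.720), specificity (0.948) and overall accuracy (0.897). The counts and function name are illustrative assumptions, not values taken from the study.

```python
# Illustrative sketch (not the study's code): how the reported binary
# screening metrics relate to a confusion matrix.
def screening_metrics(tp, fn, tn, fp):
    """Return sensitivity, specificity and overall accuracy."""
    sensitivity = tp / (tp + fn)                # true-positive rate
    specificity = tn / (tn + fp)                # true-negative rate
    accuracy = (tp + tn) / (tp + fn + tn + fp)  # fraction correct overall
    return sensitivity, specificity, accuracy

# Hypothetical counts consistent with the abstract's independent test set:
# 261 positive (LCA + PRELCA) and 915 negative (BLT + NORM) images.
sens, spec, acc = screening_metrics(tp=188, fn=73, tn=867, fp=48)
print(f"sensitivity={sens:.3f} specificity={spec:.3f} accuracy={acc:.3f}")
```

Note that sensitivity and specificity alone do not determine AUC, which requires the model's continuous scores, so the reported AUC of 0.953 cannot be reconstructed from counts this way.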
Objectives: This study investigated the usefulness and performance of a two-stage attention-aware convolutional neural network (CNN) for the automated diagnosis of otitis media from tympanic membrane (TM) images.
Design: A classification model development and validation study in ears with otitis media, based on otoscopic TM images. Two commonly used CNNs were trained and evaluated on the dataset. Building on Class Activation Maps (CAMs), a two-stage classification pipeline was developed to improve accuracy and reliability and to simulate an expert reading the TM images.
Setting and participants: This retrospective study used otoendoscopic images obtained from the Department of Otorhinolaryngology in China. A dataset of 6066 otoscopic images from 2022 participants was generated, comprising four kinds of TM images: normal eardrum, otitis media with effusion (OME), and two stages of chronic suppurative otitis media (CSOM).
Results: The proposed method achieved an overall accuracy of 93.4% using ResNet50 as the backbone network in threefold cross-validation. The F1 score was 94.3% for normal images and 96.8% for OME. There was a small difference between the active and inactive stages of CSOM, which achieved F1 scores of 91.7% and 82.4%, respectively. The results demonstrate classification performance equivalent to the diagnostic level of an associate professor of otolaryngology.
Conclusions: CNNs provide a useful and effective tool for the automated classification of TM images. In addition, a weakly supervised method such as CAM can help the network focus on discriminative parts of the image and improve performance on a relatively small database. This two-stage method helps improve the accuracy of otitis media diagnosis for junior otolaryngologists and physicians in other disciplines.
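The abstract's two-stage pipeline relies on Class Activation Maps to highlight the discriminative TM region. A minimal NumPy sketch of the standard CAM formulation is shown below, assuming a classifier with global average pooling so that each class's map is a weighted sum of the final convolutional feature channels; the array shapes and function name are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def class_activation_map(features, weights, class_idx):
    """Standard CAM for a GAP-based classifier.

    features: (K, H, W) final conv feature maps.
    weights:  (C, K) classifier weights mapping pooled channels to classes.
    Returns an (H, W) map normalized to [0, 1].
    """
    # Weighted sum over the K channels for the chosen class.
    cam = np.tensordot(weights[class_idx], features, axes=1)  # -> (H, W)
    cam -= cam.min()
    if cam.max() > 0:
        cam /= cam.max()  # normalize so the map can be thresholded/cropped
    return cam

# Toy example: 4 channels on an 8x8 grid, 3 classes, random values.
rng = np.random.default_rng(0)
f = rng.random((4, 8, 8))
w = rng.random((3, 4))
cam = class_activation_map(f, w, class_idx=1)
```

In a two-stage pipeline of this kind, the normalized map would typically be thresholded to crop the attended region, which is then fed to the second-stage classifier.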