Oral cancer is a growing health issue in a number of low- and middle-income countries (LMICs), particularly in South and Southeast Asia. The described dual-modality, dual-view, point-of-care oral cancer screening device, developed for high-risk populations in remote regions with limited infrastructure, implements autofluorescence imaging (AFI) and white light imaging (WLI) on a smartphone platform, enabling early detection of pre-cancerous and cancerous lesions in the oral cavity with the potential to reduce morbidity, mortality, and overall healthcare costs. Using a custom Android application, this device synchronizes external light-emitting diode (LED) illumination and image capture for AFI and WLI. Data are uploaded to a cloud server for diagnosis by a remote specialist through a web app, with the ability to transmit triage instructions back to the device and patient. Finally, with the on-site specialist's diagnosis as the gold standard, the remote specialist and a convolutional neural network (CNN) were able to classify 170 image pairs as 'suspicious' or 'not suspicious' with sensitivities, specificities, positive predictive values, and negative predictive values ranging from 81.25% to 94.94%.
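The four reported metrics all derive from the same confusion matrix. A minimal sketch of how they are computed (the counts below are illustrative stand-ins summing to the 170 image pairs; the abstract does not report the actual confusion matrix):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Compute standard screening metrics from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)  # true-positive rate among diseased
    specificity = tn / (tn + fp)  # true-negative rate among healthy
    ppv = tp / (tp + fp)          # positive predictive value
    npv = tn / (tn + fn)          # negative predictive value
    return sensitivity, specificity, ppv, npv

# Hypothetical counts for 170 image pairs (75 + 10 + 79 + 6 = 170)
sens, spec, ppv, npv = diagnostic_metrics(tp=75, fp=10, tn=79, fn=6)
print(f"sens={sens:.2%} spec={spec:.2%} ppv={ppv:.2%} npv={npv:.2%}")
```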
With the goal of screening high-risk populations for oral cancer in low- and middle-income countries (LMICs), we have developed a low-cost, portable, easy-to-use, smartphone-based intraoral dual-modality imaging platform. In this paper, we present an image classification approach based on autofluorescence and white light images using deep learning methods. The information from each autofluorescence and white light image pair is extracted, calculated, and fused to feed the deep learning neural networks. We have investigated and compared the performance of different convolutional neural networks, transfer learning, and several regularization techniques for oral cancer classification. Our experimental results demonstrate the effectiveness of deep learning methods in classifying dual-modal images for oral cancer detection.
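One common way to fuse such an image pair is to stack the white light channels with the autofluorescence channels plus a derived feature map into a single multi-channel input tensor. The sketch below uses the red/green intensity ratio as the derived feature (loss of green autofluorescence is a known correlate of dysplasia); the specific features and fusion scheme used by the paper are an assumption here, not its published method:

```python
import numpy as np

def fuse_pair(afi_rgb, wli_rgb, eps=1e-6):
    """Fuse a white light image and an autofluorescence image into one
    H x W x 7 tensor: 3 WLI channels + 3 AFI channels + 1 R/G ratio map."""
    afi = afi_rgb.astype(np.float32) / 255.0
    wli = wli_rgb.astype(np.float32) / 255.0
    rg_ratio = afi[..., 0] / (afi[..., 1] + eps)   # per-pixel red/green ratio
    return np.concatenate([wli, afi, rg_ratio[..., None]], axis=-1)

# Demo with random 224x224 images (a typical CNN input resolution)
afi = np.random.randint(0, 256, (224, 224, 3), dtype=np.uint8)
wli = np.random.randint(0, 256, (224, 224, 3), dtype=np.uint8)
print(fuse_pair(afi, wli).shape)  # (224, 224, 7)
```

A 7-channel input only requires adapting the first convolutional layer of a standard backbone; the rest of the network is unchanged.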
Early detection of oral cancer in low-resource settings necessitates a point-of-care screening tool that empowers frontline health workers (FHWs). This study was conducted to validate the accuracy of a convolutional neural network (CNN)-enabled mobile health (mHealth) device deployed with FHWs for delineation of suspicious oral lesions (malignant/potentially malignant disorders). The effectiveness of the device was tested in tertiary-care hospitals and low-resource settings in India. The subjects were screened independently, either by FHWs alone or along with specialists. All the subjects were also remotely evaluated by oral cancer specialists. The program screened 5025 subjects (images: 32,128), with 95% (n = 4728) receiving telediagnosis. Among the 16% (n = 752) assessed by onsite specialists, 20% (n = 102) underwent biopsy. Simple and complex CNNs were integrated into the mobile phone and the cloud, respectively. The onsite specialist diagnosis showed a high sensitivity (94%) when compared to histology, while telediagnosis showed high accuracy in comparison with onsite specialists (sensitivity: 95%; specificity: 84%). FHWs, however, when compared with telediagnosis, identified suspicious lesions with lower sensitivity (60%). The phone-integrated CNN (MobileNet) accurately delineated lesions (n = 1416; sensitivity: 82%), and the cloud-based CNN (VGG19) had higher accuracy (sensitivity: 87%) with telediagnosis as the reference standard. The results of the study suggest that an automated mHealth-enabled, dual-image system is a useful triaging tool and empowers FHWs for oral cancer screening in low-resource settings.
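The two-tier design described above (a lightweight on-device model for every image, a larger cloud model when connectivity allows) can be sketched as a simple decision flow. The model interfaces, threshold, and fallback behavior below are assumptions for illustration, not the paper's implementation:

```python
def triage(lesion_image, on_device_model, cloud_model,
           threshold=0.5, has_connectivity=True):
    """Two-tier triage sketch: a MobileNet-class model on the phone screens
    every image; a VGG19-class cloud model refines the call when reachable."""
    p_device = on_device_model(lesion_image)   # P(suspicious) on the phone
    if not has_connectivity:
        # Offline fallback: the on-device score alone decides the triage label
        return "suspicious" if p_device >= threshold else "not suspicious"
    p_cloud = cloud_model(lesion_image)        # refined cloud estimate
    return "suspicious" if p_cloud >= threshold else "not suspicious"

# Toy stand-in models returning fixed probabilities
print(triage("img.jpg", lambda x: 0.7, lambda x: 0.3))  # not suspicious
```

In deployment the triage label would drive the referral decision: "suspicious" cases are recalled for specialist review or biopsy.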
Aim: Globally, India accounts for the highest number of oral cancer cases. The survival rates are about 30% lower than those in developed countries. The main reason for these dismal figures is the late presentation of patients. In order to downstage oral cancer in such a scenario, screening and diagnosis at an early stage is warranted. A pragmatic approach is needed for an oral cancer screening program; hence, a mobile health (mHealth) approach was used. In this approach, health workers were empowered with mobile phones running a decision-based algorithm. Risk stratification of tobacco habits enables identification of lesions associated with particular habits. Materials and methods: A specific cohort of factory employees who predominantly had a pure tobacco chewing habit was chosen to examine the effect of pure tobacco on the oral mucosa. One thousand three hundred and fifty-seven subjects were screened in two phases. In the first phase, habits and oral lesions were identified and photographed. The photographs were remotely diagnosed by an oral medicine specialist, and those subjects requiring biopsy were recalled for phase II. Cytology and biopsy were performed in phase II.
Gastrointestinal diseases are associated with alterations in the mouth or influence the course of dental diseases, and dental health care workers are expected to recognize, diagnose, and treat oral conditions associated with gastrointestinal diseases, as well as provide safe and appropriate dental care for afflicted individuals. Drugs used in the management of these diseases can cause oral adverse effects and are also known to interact with those prescribed during dental care. Hence, this article reviews the drug considerations and guidelines for drug use during dental management of patients with gastrointestinal diseases.