Purpose Accurate assessment of burns is increasingly sought due to the diagnostic challenges of traditional visual assessment. While visual assessment is the most established means of evaluating burns globally, specialised dermatologists are not readily available in most locations and the assessment is highly subjective. Other technical devices such as Laser Doppler Imaging are highly expensive, while the rate of burn occurrence is high in low- and middle-income countries. This necessitates robust and cost-effective assessment techniques that can act as an affordable alternative to human expertise. Method In this paper, we present a technique to discriminate skin burns using deep transfer learning, chosen because the available datasets are too deficient to train a model from scratch. Two dense layers and a classification layer were added to replace the existing top layers of a pre-trained ResNet50 model. Results The proposed approach was able to discriminate between burns and healthy skin in subjects of both ethnicities studied (Caucasian and African). We present an extensive analysis of the effect of training a machine learning algorithm on homogeneous versus heterogeneous datasets. The findings show that training on a homogeneous dataset produces a diagnostic model biased against minority racial groups, while training on heterogeneous datasets produces a robust diagnostic model. Recognition accuracies of up to 97.1% and 99.3% were achieved on the African and Caucasian datasets respectively. Conclusion We conclude that a robust diagnostic machine learning model for burns assessment is feasible and can be deployed to remote locations that lack access to specialised burns specialists, thereby aiding decision-making as quickly as possible.
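The transfer-learning setup described in this abstract — a pre-trained ResNet50 base whose top layers are replaced with two dense layers and a classification layer — can be sketched in Keras roughly as follows. The layer widths and input size are illustrative assumptions (the abstract does not specify them), and `weights=None` is used only to keep the sketch self-contained and offline; in practice ImageNet weights would be loaded with `weights="imagenet"`.

```python
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import ResNet50

def build_burn_classifier(num_classes=2, dense_units=(256, 128)):
    # Pre-trained base without its original top layers; pooling="avg"
    # flattens each image into a 2048-d feature vector.
    # weights=None keeps the sketch offline; use weights="imagenet" in practice.
    base = ResNet50(weights=None, include_top=False,
                    input_shape=(224, 224, 3), pooling="avg")
    base.trainable = False  # freeze base layers; only the new top is trained

    # Replace the top with two dense layers and a classification layer.
    x = base.output
    for units in dense_units:
        x = layers.Dense(units, activation="relu")(x)
    out = layers.Dense(num_classes, activation="softmax")(x)
    return models.Model(base.input, out)

model = build_burn_classifier()
model.compile(optimizer="adam", loss="categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

Freezing the base layers means only the small new head is optimised, which is what makes training feasible on a deficient dataset.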
Purpose Burn depth evaluation is a lifesaving and very challenging task that requires objective techniques to accomplish. While visual assessment is the method most commonly used by surgeons, its reliability ranges between 60 and 80%, and it is subjective and lacks any standard guideline. Currently, the only standard adjunct to clinical evaluation of burn depth is Laser Doppler Imaging (LDI), which measures microcirculation within the dermal tissue and provides the burn's potential healing time, which corresponds to the depth of the injury, achieving up to 100% accuracy. However, the use of LDI is limited by many factors: high equipment and diagnostic costs, accuracy degraded by patient movement (which makes paediatric patients difficult to assess), the high level of human expertise required to operate the device, and the fact that 100% accuracy is only possible after 72 h. These shortfalls necessitate an objective and affordable technique. Method In this study, we leverage deep transfer learning, using two pre-trained models, ResNet50 and VGG16, to extract image patterns (ResFeat50 and VggFeat16) from a burn dataset of 2080 RGB images composed of healthy skin, first-degree, second-degree and third-degree burns, evenly distributed. We then use One-versus-One Support Vector Machines (SVM) for multi-class prediction, trained with 10-fold cross-validation to achieve an optimum trade-off between bias and variance. Results The proposed approach yields a maximum prediction accuracy of 95.43% using ResFeat50 and 85.67% using VggFeat16. The average recall, precision and F1-score are 95.50%, 95.50% and 95.50% for ResFeat50, and 85.75%, 86.25% and 85.75% for VggFeat16 respectively. Conclusion The proposed pipeline achieves state-of-the-art prediction accuracy and, interestingly, indicates that a decision on whether the injury requires surgical intervention such as skin grafting can be made in less than a minute.
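The classification stage of this pipeline — One-versus-One SVMs over deep features, evaluated with 10-fold cross-validation — can be sketched with scikit-learn as below. Since the actual ResFeat50 features are not available here, a synthetic stand-in with the same shape (2048-d vectors, four balanced classes) is generated; the dataset construction and kernel choice are assumptions for illustration only.

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.multiclass import OneVsOneClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Synthetic stand-in for ResFeat50: 2048-d deep features, 4 balanced classes
# (healthy skin, first-, second-, third-degree burn).
X = rng.normal(size=(200, 2048))
y = np.repeat(np.arange(4), 50)
X += y[:, None] * 0.5  # inject class structure so CV scores are meaningful

# One-versus-One trains one binary SVM per class pair (6 for 4 classes).
clf = make_pipeline(StandardScaler(), OneVsOneClassifier(SVC(kernel="linear")))

# 10-fold stratified CV: each fold keeps the even class balance of the data.
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv)
print(f"10-fold CV accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```

Cross-validating over 10 folds averages out the luck of any single train/test split, which is the bias–variance trade-off the abstract refers to.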
Malaria is one of the most infectious diseases in the world, particularly in developing continents such as Africa and Asia. Due to the high number of cases and the lack of sufficient diagnostic facilities and experienced medical personnel, there is a need for advanced diagnostic procedures to complement existing methods. For this reason, this study proposes the use of machine-learning models to detect the malaria parasite in blood-smear images. Six different feature sets were extracted using the VGG16, VGG19, ResNet50, ResNet101, DenseNet121, and DenseNet201 models. Decision Tree, Support Vector Machine, Naïve Bayes, and K-Nearest Neighbour classifiers were then trained on these six feature sets. An extensive performance analysis is presented in terms of precision, recall, F1-score, accuracy, and computational time. The results showed that automating the process can effectively detect the malaria parasite in blood samples with an accuracy of over 94% and with less complexity than previous approaches found in the literature.
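The classifier-comparison stage described above — training Decision Tree, SVM, Naïve Bayes, and K-Nearest Neighbour models on deep features — can be sketched as follows. The real blood-smear features are not available here, so a synthetic stand-in is used (512-d vectors, as a VGG-style backbone would produce, with binary parasitized/uninfected labels); the feature dimensionality, split ratio and hyperparameters are illustrative assumptions.

```python
import numpy as np
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
# Synthetic stand-in for one backbone's features;
# labels: 0 = uninfected, 1 = parasitized.
X = rng.normal(size=(400, 512))
y = np.repeat([0, 1], 200)
X += y[:, None] * 0.4  # inject class structure

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=1)

# The four classical classifiers compared in the study.
classifiers = {
    "DecisionTree": DecisionTreeClassifier(random_state=1),
    "SVM": SVC(kernel="linear"),
    "NaiveBayes": GaussianNB(),
    "KNN": KNeighborsClassifier(n_neighbors=5),
}
results = {name: accuracy_score(y_te, c.fit(X_tr, y_tr).predict(X_te))
           for name, c in classifiers.items()}
for name, acc in results.items():
    print(f"{name}: {acc:.3f}")
```

In the study this loop would run once per backbone (VGG16, VGG19, ResNet50, ResNet101, DenseNet121, DenseNet201), yielding the 6 × 4 grid of feature/classifier combinations whose precision, recall, F1-score, accuracy and timing are compared.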
While visual assessment is the standard technique for burn evaluation, computer-aided diagnosis is increasingly sought due to the high number of incidences globally. Patients increasingly face challenges including, but not limited to, a shortage of experienced clinicians, lack of access to healthcare facilities and high diagnostic costs. A number of studies have been proposed for discriminating burnt and healthy skin using machine learning, leaving an important gap unaddressed: whether burns and related skin injuries can be effectively discriminated using machine learning techniques. In this paper we therefore use transfer learning, leveraging pre-trained deep learning models because the dataset is too deficient to train a model from scratch, to discriminate two classes of skin injury: burnt skin and injured skin. Experiments were extensively conducted using three state-of-the-art pre-trained deep learning models, ResNet50, ResNet101 and ResNet152, for image pattern extraction via two transfer learning strategies: a fine-tuning approach, in which the dense and classification layers were modified and trained on features extracted by the base layers, and a second approach in which a support vector machine (SVM) replaced the top layers of the pre-trained models and was trained on off-the-shelf features from the base layers. Our proposed approach records a near-perfect classification accuracy of approximately 99.9% in categorising burnt skin and injured skin.
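The second transfer-learning strategy from this abstract — a frozen pre-trained network used as an off-the-shelf feature extractor, with an SVM replacing its top layers — can be sketched as below. Random images stand in for the burn/injury photos, and `weights=None` keeps the sketch offline (the study would load ImageNet weights); the ResNet50 variant is chosen here arbitrarily from the three backbones tested.

```python
import numpy as np
import tensorflow as tf
from sklearn.svm import LinearSVC
from tensorflow.keras.applications import ResNet50

# Frozen base network as an off-the-shelf feature extractor; pooling="avg"
# turns each image into a single 2048-d feature vector.
# weights=None keeps the sketch offline; use weights="imagenet" in practice.
extractor = ResNet50(weights=None, include_top=False,
                     pooling="avg", input_shape=(224, 224, 3))

rng = np.random.default_rng(0)
images = rng.random((8, 224, 224, 3)).astype("float32")  # stand-in photos
labels = np.array([0, 1] * 4)  # 0 = burnt skin, 1 = injured skin

# Extract features once, then let a linear SVM play the role of the
# network's discarded top layers.
features = extractor.predict(images, verbose=0)
svm = LinearSVC().fit(features, labels)
print(svm.predict(features))
```

Compared with fine-tuning, this strategy never updates the network at all: only the SVM is trained, which makes it cheap and well suited to very small datasets.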