Smartphone wound image analysis has recently emerged as a viable way to assess healing progress and provide actionable feedback to patients and caregivers between hospital appointments. Segmentation is a key image analysis step, after which attributes of the wound segment (e.g. wound area and tissue composition) can be analyzed. The Associated Hierarchical Random Field (AHRF) formulates the image segmentation problem as a graph optimization problem. Handcrafted features are extracted, which are then classified using machine learning classifiers. More recently, deep learning approaches have emerged and demonstrated superior performance for a wide range of image analysis tasks. FCN, U-Net and DeepLabV3 are Convolutional Neural Networks used for semantic segmentation. While in separate experiments each of these methods has shown promising results, no prior work has comprehensively and systematically compared the approaches on the same large wound image dataset, or more generally compared deep learning vs non-deep learning wound image segmentation approaches. In this paper, we compare the segmentation performance of AHRF and CNN approaches (FCN, U-Net, DeepLabV3) using various metrics including segmentation accuracy (Dice score), inference time, amount of training data required, and performance on diverse wound sizes and tissue types. Improvements possible using various image pre- and post-processing techniques are also explored. As access to adequate medical images/data is a common constraint, we explore the sensitivity of the approaches to the size of the wound dataset. We found that for small datasets (<300 images), AHRF is more accurate than U-Net but not as accurate as FCN and DeepLabV3. AHRF inference is also over 1000x slower than the CNN approaches. For larger datasets (>300 images), AHRF saturates quickly, and all CNN approaches (FCN, U-Net and DeepLabV3) are significantly more accurate than AHRF.
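The Dice score used above as the segmentation accuracy metric can be computed directly from two binary masks. The sketch below is a minimal illustration, not the authors' evaluation code; the mask values are made-up examples:

```python
import numpy as np

def dice_score(pred: np.ndarray, gt: np.ndarray) -> float:
    """Dice coefficient between two binary masks: 2|A∩B| / (|A| + |B|)."""
    pred = pred.astype(bool)
    gt = gt.astype(bool)
    denom = pred.sum() + gt.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(pred, gt).sum() / denom

# Toy 2x3 predicted and ground-truth wound masks
pred = np.array([[1, 1, 0],
                 [0, 1, 0]])
gt   = np.array([[1, 0, 0],
                 [0, 1, 1]])
print(dice_score(pred, gt))  # 2*2 / (3+3) ≈ 0.667
```

A Dice score of 1.0 indicates perfect overlap between the predicted and ground-truth wound regions; 0.0 indicates no overlap.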
Goal: Chronic wounds affect 6.5 million Americans. Wound assessment via algorithmic analysis of smartphone images has emerged as a viable option for remote assessment. Methods: We comprehensively score wounds based on the clinically-validated Photographic Wound Assessment Tool (PWAT), which comprehensively assesses clinically important ranges of eight wound attributes: Size, Depth, Necrotic Tissue Type, Necrotic Tissue Amount, Granulation Tissue Type, Granulation Tissue Amount, Edges, Periulcer Skin Viability. We propose a DenseNet Convolutional Neural Network (CNN) framework with patch-based context-preserving attention to assess the 8 PWAT attributes of four wound types: diabetic ulcers, pressure ulcers, vascular ulcers and surgical wounds. Results: In an evaluation on our dataset of 1639 wound images, our model estimated all 8 PWAT sub-scores with classification accuracies and F1 scores of over 80%. Conclusions: Our work is the first intelligent system that autonomously grades wounds comprehensively based on criteria in the PWAT rubric, alleviating the significant burden that manual wound grading imposes on wound care nurses.
Motivation: Infection (bacteria in the wound) and ischemia (insufficient blood supply) in Diabetic Foot Ulcers (DFUs) increase the risk of limb amputation. Goal: To develop an image-based DFU infection and ischemia detection system that uses deep learning. Methods: The DFU dataset was augmented using geometric and color image operations, after which binary infection and ischemia classification was done using the EfficientNet deep learning model and a comprehensive set of baselines. Results: The EfficientNet model achieved 99% accuracy in ischemia classification and 98% in infection classification, outperforming ResNet and Inception (87% accuracy) and the prior state of the art, an Ensemble CNN (classification accuracy of 90% for ischemia, 73% for infection). EfficientNet also classified test images in a fraction (10% to 50%) of the time taken by baseline models. Conclusions: This work demonstrates that EfficientNet is a viable deep learning model for infection and ischemia classification.
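The geometric and color augmentation operations mentioned above can be illustrated with two common examples: a horizontal flip and a brightness adjustment. This is a minimal numpy sketch under the assumption of uint8 RGB arrays, not the authors' augmentation pipeline (which may use additional operations such as rotations and contrast shifts):

```python
import numpy as np

def hflip(img: np.ndarray) -> np.ndarray:
    """Geometric augmentation: mirror the image left-right."""
    return img[:, ::-1].copy()

def adjust_brightness(img: np.ndarray, factor: float) -> np.ndarray:
    """Color augmentation: scale pixel intensities, clipping to [0, 255]."""
    return np.clip(img.astype(np.float32) * factor, 0, 255).astype(np.uint8)

# Toy 1x2 RGB image
img = np.array([[[10, 20, 30], [200, 210, 220]]], dtype=np.uint8)
flipped = hflip(img)                     # pixels swapped left-right
brighter = adjust_brightness(img, 1.5)   # intensities scaled, saturating at 255
```

Each augmented copy keeps the original infection/ischemia label, multiplying the effective size of the labeled training set.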
Goal: Augment a small, imbalanced wound dataset by using semi-supervised learning with a secondary dataset, then utilize the augmented wound dataset for deep learning-based wound assessment. Methods: The clinically-validated Photographic Wound Assessment Tool (PWAT) scores eight wound attributes: Size, Depth, Necrotic Tissue Type, Necrotic Tissue Amount, Granulation Tissue Type, Granulation Tissue Amount, Edges, Periulcer Skin Viability, to comprehensively assess chronic wound images. A small corpus of 1639 wound images labeled with ground truth PWAT scores was used as reference. Semi-supervised learning and a Progressive Multi-Granularity (PMG) training mechanism were used to leverage a secondary corpus of 9870 unlabeled wound images. Wound scoring utilized the EfficientNet Convolutional Neural Network on the augmented wound corpus. Results: Our proposed Semi-Supervised PMG EfficientNet (SS-PMG-EfficientNet) approach estimated all 8 PWAT sub-scores with classification accuracies and F1 scores of about 90% on average, outperforming a comprehensive list of baseline models with a 7% improvement over the prior state of the art (without data augmentation). We also demonstrate that synthetic wound image generation using Generative Adversarial Networks (GANs) did not improve wound assessment. Conclusions: Semi-supervised learning on unlabeled wound images in a secondary dataset achieved impressive performance for deep learning-based wound grading.
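A common way to leverage an unlabeled secondary corpus, as described above, is confidence-thresholded pseudo-labeling: a model trained on the labeled set predicts on the unlabeled images, and only highly confident predictions are added to the training pool. The sketch below is a hedged illustration of that general idea, not the authors' SS-PMG training procedure; the `probs` values stand in for hypothetical softmax outputs of a teacher model:

```python
import numpy as np

def pseudo_label(probs: np.ndarray, threshold: float = 0.9):
    """Keep unlabeled samples whose top predicted class probability
    exceeds the threshold; return their indices and assigned labels."""
    confidence = probs.max(axis=1)
    labels = probs.argmax(axis=1)
    keep = confidence >= threshold
    return np.where(keep)[0], labels[keep]

# Hypothetical softmax outputs from a teacher model on 4 unlabeled images.
probs = np.array([
    [0.95, 0.03, 0.02],   # confident  -> kept, pseudo-label 0
    [0.40, 0.35, 0.25],   # uncertain  -> discarded
    [0.05, 0.92, 0.03],   # confident  -> kept, pseudo-label 1
    [0.30, 0.30, 0.40],   # uncertain  -> discarded
])
idx, labels = pseudo_label(probs)
print(idx, labels)  # [0 2] [0 1]
```

The threshold trades off pseudo-label quantity against quality: a higher threshold admits fewer but cleaner pseudo-labeled images into the augmented training set.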