Abstract. In this paper, a burn color image segmentation and classification system is proposed. The aim of the system is to separate burn wounds from healthy skin, and to distinguish among the different types of burns (burn depths). Digital color photographs are used as inputs to the system. The system is based on color and texture information, since these are the characteristics observed by physicians in order to form a diagnosis. A perceptually uniform color space (L*u*v*) was used, since Euclidean distances calculated in this space correspond to perceptual color differences. After the burn is segmented, a set of color and texture features is calculated that serves as the input to a Fuzzy-ARTMAP neural network. The neural network classifies burns into three types of burn depths: superficial dermal, deep dermal, and full thickness. Clinical effectiveness of the method was demonstrated on 62 clinical burn wound images, yielding an average classification success rate of 82%.
With rapid advances in technology, Computer Aided Diagnosis (CAD) systems are becoming increasingly popular. However, research on colour skin images has progressed slowly, owing to the difficulty of translating human colour perception into objective rules that a computer can analyze. For this reason, the automation of burn wound diagnosis remains an almost unexplored field. While there is little literature on burn depth determination by visual image analysis and processing [4][5], some research exists on the relationship between burn depth and surface temperature [6], and other works attempt to evaluate burn depth using thermographic images [7], infrared and ultraviolet images [8], radioactive isotopes [9] and laser Doppler flux measurements [10].
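As a concrete illustration of the perceptual-uniformity property the abstract relies on, the following sketch converts an 8-bit sRGB pixel to CIE L*u*v* and measures the Euclidean distance between two colours. This is a minimal, self-contained implementation assuming a D65 reference white and the standard sRGB primaries; the paper's actual conversion pipeline is not specified in this excerpt.

```python
import math

# D65 reference white (an assumption; the excerpt does not state the illuminant).
XN, YN, ZN = 0.95047, 1.0, 1.08883
UN = 4 * XN / (XN + 15 * YN + 3 * ZN)
VN = 9 * YN / (XN + 15 * YN + 3 * ZN)

def rgb_to_luv(r, g, b):
    """Convert an 8-bit sRGB triplet to CIE L*u*v*."""
    def linearize(c):
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    rl, gl, bl = linearize(r), linearize(g), linearize(b)
    # sRGB -> CIE XYZ (D65 matrix)
    x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
    yr = y / YN
    L = 116 * yr ** (1 / 3) - 16 if yr > (6 / 29) ** 3 else (29 / 3) ** 3 * yr
    denom = x + 15 * y + 3 * z
    up = 4 * x / denom if denom else UN
    vp = 9 * y / denom if denom else VN
    return L, 13 * L * (up - UN), 13 * L * (vp - VN)

def delta_e(c1, c2):
    """Euclidean distance in L*u*v*, which approximates perceived colour difference."""
    return math.dist(rgb_to_luv(*c1), rgb_to_luv(*c2))
```

Because L*u*v* is approximately perceptually uniform, equal Euclidean distances returned by `delta_e` correspond to roughly equal perceived colour differences, which is exactly the property that makes distance-based segmentation meaningful in this space.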
These techniques are limited not only in diagnostic accuracy but also by prohibitive cost.

Speaking more generally about colour skin image processing, two main applications appear in the literature [11]: the assessment of the healing of skin wounds or ulcers [12][13][14][15][16], and the diagnosis of pigmented skin lesions such as melanomas [17][18][19][20]. The analysis of lesions involves traditional image processing techniques such as edge detection and object identification, followed by an analysis of the size, shape, irregularity and colour of the segmented lesion. In wound analysis, however, although it is still necessary to detect the wound border and calculate its area, analysis of the colours within the wound site is often more important. In particular, for burn depth determination we do not focus on the shape of the burn, because shape is irrelevant for predicting depth. The main characteristics for this purpose are colour and texture, as these are what physicians observe in order to give a diagnosis.

The developed system consists of the following steps: 1. Image acquisition. We have developed a new protocol for standardizing image acquisition [2][3]. 2. Segmentation. Many segmentation algorithms...
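The excerpt states that colour and texture descriptors are computed over the segmented burn and fed to the classifier, but does not list the exact descriptors. As an illustrative sketch only, a per-region colour-statistics vector (mean, standard deviation and skewness of each L*u*v* channel) might look like this; the function name and the nine-feature layout are assumptions, not the paper's actual design:

```python
import numpy as np

def region_features(luv_pixels):
    """Colour-statistics descriptor for a segmented burn region.

    luv_pixels: (N, 3) array holding the L*, u*, v* values of the
    pixels inside the burn mask. Returns a 9-element vector:
    per-channel mean, standard deviation and skewness.
    """
    luv_pixels = np.asarray(luv_pixels, dtype=float)
    mean = luv_pixels.mean(axis=0)
    std = luv_pixels.std(axis=0)
    centered = luv_pixels - mean
    # Skewness captures asymmetry of the colour distribution within the wound.
    skew = (centered ** 3).mean(axis=0) / np.maximum(std, 1e-9) ** 3
    return np.concatenate([mean, std, skew])
```

A vector of this kind would then serve as the input pattern to the Fuzzy-ARTMAP network; texture descriptors (e.g. co-occurrence statistics) would simply be concatenated onto the same vector.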