Background: Accurate skin colour measurements are important for numerous medical applications, including the diagnosis and treatment of cutaneous disorders and the provision of maxillofacial soft tissue prostheses. Methods: In this study, we obtained accurate skin colour measurements from four ethnic groups (Caucasian, Chinese, Kurdish, Thai) at four body locations (forehead, cheek, inner arm, back of hand) with a view to establishing a new skin colour database for medical and cosmetic applications. Skin colours were measured with a spectrophotometer and converted to a device-independent standard colour appearance space (CIELAB), in which skin colour is expressed along three dimensions: lightness (L*), redness (a*) and yellowness (b*). Skin colour differences and variation were then evaluated as a function of ethnicity and body location. Results: We report three main results: (1) When plotted in CIELAB, the skin colour distributions of the four ethnic groups overlap substantially, although there are systematic mean differences. Between ethnicities, the largest skin colour differences occur along the yellowness dimension, with Thai skin exhibiting the highest yellowness (b*) value and Caucasian skin the lowest. Facial redness (a*) is invariant across the four ethnic groups.
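The conversion from spectrophotometer readings to CIELAB described above follows the standard CIE formulae once XYZ tristimulus values are available. A minimal sketch of the XYZ-to-CIELAB step, assuming the D65 white point (the abstract does not state which illuminant or observer was used):

```python
import math

D65 = (95.047, 100.0, 108.883)  # assumed D65 white point, 2-degree observer

def xyz_to_lab(X, Y, Z, white=D65):
    """Convert XYZ tristimulus values to CIELAB (L*, a*, b*)."""
    def f(t):
        # Cube root with the linear segment near zero (CIE definition)
        d = 6.0 / 29.0
        return t ** (1.0 / 3.0) if t > d ** 3 else t / (3.0 * d * d) + 4.0 / 29.0

    fx, fy, fz = (f(v / w) for v, w in zip((X, Y, Z), white))
    L = 116.0 * fy - 16.0        # lightness
    a = 500.0 * (fx - fy)        # red-green axis
    b = 200.0 * (fy - fz)        # yellow-blue axis
    return L, a, b
```

By construction, the white point itself maps to L* = 100 with a* = b* = 0, so skin colours cluster below and around the achromatic axis.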
A method to reconstruct spectral reflectance from RGB images is presented that requires no a priori knowledge of the camera's spectral responsivity. To obtain the spectral reflectance of a pixel or region in an image, the method assumes that its reflectance is a weighted average of the reflectances of samples in a selected training group, all of which have a small color difference from that pixel or region. Four proposed weighting modes with different numbers of selected training samples were investigated. Among them, the inverse-square weighting mode achieves the best performance and is not very sensitive to the number of selected training samples. Experimental results show that all weighting modes outperform the traditional method in terms of the root mean squared error and Goodness-of-Fit Coefficient between the actual and reconstructed reflectances, as well as color differences under a different lighting condition. © 2016 Wiley Periodicals, Inc. Col Res Appl, 42, 327–332, 2017
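The weighted-average reconstruction with inverse-square weighting can be sketched as follows. The function name, the use of Euclidean RGB distance as the color-difference measure, and the data layout are assumptions for illustration; the paper presumably computes color differences in a device-independent space such as CIELAB:

```python
import numpy as np

def reconstruct_reflectance(query_rgb, train_rgb, train_refl, k=10, eps=1e-9):
    """Estimate a query's spectral reflectance as a weighted average of the
    k training samples closest in color, with inverse-square weights.

    train_rgb:  (n, 3) training colors; train_refl: (n, bands) reflectances.
    """
    # Color difference between the query and each training sample
    # (Euclidean RGB distance here as a stand-in for a CIELAB delta-E).
    d = np.linalg.norm(train_rgb - query_rgb, axis=1)
    idx = np.argsort(d)[:k]            # keep the k most similar samples
    w = 1.0 / (d[idx] ** 2 + eps)      # inverse-square weighting mode
    w /= w.sum()                       # normalize weights to sum to 1
    return w @ train_refl[idx]
```

A query that exactly matches a training color receives nearly all the weight, so the reconstruction degrades gracefully toward nearest-neighbor lookup while still blending nearby samples.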
Light-emitting diode (LED)-based light sources are widely applied across numerous industries and in everyday use. Recently, the LED-based light source consisting of red, green and blue LEDs with narrow spectral bands (RGB-LED) has become a preferred illumination source over the common white phosphor LED and other traditional broadband light sources, because the RGB-LED can produce many types of illumination color. The color rendering index of the RGB-LED, however, is considerably lower than that of traditional broadband light sources and of the multi-band LED light source (MB-LED), which is composed of several LEDs and can accurately simulate daylight illuminants. Given the three relatively narrow spectral bands of the RGB-LED light source, color constancy, the ability of the human visual system to attenuate the influence of illumination color changes and hold the perception of a surface color constant, may be worse under the RGB-LED than under traditional broadband light sources or the MB-LED. In this study, we investigated categorical color constancy using a color naming method with real Munsell color chips under illumination changes from neutral to red, green, blue and yellow illuminations. The neutral and four chromatic illuminants were produced by the RGB-LED light source. A modified color constancy index, which describes the centroid shift of each color category, was introduced to evaluate color constancy performance. The results revealed that categorical color constancy under the four chromatic illuminants held relatively well, except for the red, brown, orange and yellow color categories under the blue illumination and the orange color category under the yellow illumination. Furthermore, categorical color constancy under the red and green illuminations was better than under the blue and yellow illuminations.
The results indicate that a color constancy mechanism in the visual system operates on color categories even when the illuminant emits an insufficient spectrum to render the colors of reflecting surfaces accurately. However, using the RGB-LED light source to produce blue and yellow illuminations is not recommended because of the poor color constancy.
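The idea of a centroid-shift constancy index can be sketched in a generic, Brunswik-ratio style: compare how far a color category's centroid moves under the test illuminant against the shift a zero-constancy observer would show. The paper's modified index is not specified in the abstract, so the normalization by the illuminant-induced shift and the function name below are assumptions:

```python
import numpy as np

def constancy_index(centroid_neutral, centroid_test, illum_shift):
    """Centroid-shift color constancy index (generic sketch).

    1.0 = perfect constancy (category centroid unchanged across illuminants),
    0.0 = no constancy (centroid moves as far as the illuminant itself shifts).
    Coordinates are assumed to live in a uniform color space such as CIELAB.
    """
    shift = np.linalg.norm(np.asarray(centroid_test, float)
                           - np.asarray(centroid_neutral, float))
    return 1.0 - shift / np.linalg.norm(np.asarray(illum_shift, float))
```

Values near 1 for most categories, with drops for red/brown/orange/yellow under blue illumination, would reproduce the pattern of results reported above.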
Underwater spectral imaging is a promising method for the mapping, classification and health monitoring of coral reefs and seafloor inhabitants. However, the spectrum of light is distorted during the underwater imaging process by wavelength-dependent attenuation in the water. This paper presents a model-based method that accurately restores the brightness of underwater spectral images captured with narrowband filters. A model is built for narrowband underwater spectral imaging. The model structure is derived from physical principles, representing absorption, scattering and refraction by the water and the optical properties of the narrowband filters, lenses and image sensors. The model coefficients are calibrated from spectral images captured underwater and in air. With the imaging model in place, the energy loss due to water attenuation is restored for images captured at different underwater distances. An experimental setup is built and experiments are carried out to verify the proposed method. Underwater images captured within an underwater distance of 260 cm are restored and compared with those captured in air. Results show that the relative restoration error is 3.58% on average for the test images, demonstrating the accuracy of the proposed method.
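The physical core of such a restoration is a Beer–Lambert-style exponential attenuation applied per spectral band. The two-capture calibration scheme and function names below are assumptions for illustration; the paper's full model also accounts for scattering, refraction and the filter/lens/sensor responses, not just absorption:

```python
import numpy as np

def calibrate_attenuation(intensity_air, intensity_uw, distance_cm):
    """Fit a per-band attenuation coefficient c (per cm) from one in-air and
    one underwater capture of the same target at a known distance.

    Beer-Lambert: I_uw = I_air * exp(-c * d)  =>  c = ln(I_air / I_uw) / d
    """
    return np.log(np.asarray(intensity_air, float)
                  / np.asarray(intensity_uw, float)) / distance_cm

def restore(intensity_uw, c, distance_cm):
    """Undo the attenuation for a capture at a given underwater distance."""
    return np.asarray(intensity_uw, float) * np.exp(c * distance_cm)
```

Because c is wavelength-dependent (water attenuates red far more strongly than blue over a 260 cm path), the calibration and restoration are applied independently to each narrowband channel.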