In this study, an image-based measurement system was developed for human facial skin colour, involving the development of a digital imaging system, collection of facial skin colours from 60 human subjects, generation of different colour-characterization models, and performance evaluation. The factors that affect facial skin colour characterization were investigated, including different training datasets (two colour charts and the collected facial skin colour dataset), mathematical mapping methods (linear transformation, polynomial regression, root-polynomial regression and neural networks) and camera image formats (JPG and RAW). Performance was quantified not only by the conventional CIELAB colour difference but also by two newly introduced measures: facial colour contrast and skin colour gamut. The results indicate that the RAW image format gave more stable performance than JPG, and that higher-order polynomial regression, despite good predictive accuracy in terms of CIELAB colour difference, did not perform well across the whole facial image. We therefore suggest evaluating model performance using both the colours of specific facial positions and the overall facial skin colour. This comparative analysis provides useful guidance for selecting a colour-characterization model for facial skin.
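The polynomial-regression mapping mentioned above can be sketched in a few lines of NumPy. The snippet below is illustrative only: it uses a second-order expansion with synthetic training data, whereas the study compares several expansion orders and real camera/chart data, and the function names (`poly_expand`, `fit_characterization`) are my own.

```python
import numpy as np

def poly_expand(rgb):
    """Second-order polynomial expansion of camera RGB signals.
    rgb: (N, 3) array of R, G, B values."""
    r, g, b = rgb[:, 0], rgb[:, 1], rgb[:, 2]
    return np.column_stack([
        r, g, b,              # linear terms
        r * g, r * b, g * b,  # cross terms
        r**2, g**2, b**2,     # quadratic terms
        np.ones_like(r),      # offset
    ])

def fit_characterization(rgb_train, lab_train):
    """Least-squares fit of a (10, 3) matrix M so that
    poly_expand(rgb) @ M approximates the CIELAB values."""
    X = poly_expand(rgb_train)
    M, *_ = np.linalg.lstsq(X, lab_train, rcond=None)
    return M

def predict_lab(rgb, M):
    """Map camera RGB to predicted CIELAB via the fitted model."""
    return poly_expand(rgb) @ M
```

In practice the training pairs would come from imaging a colour chart (or the facial dataset) and measuring the same patches with a spectrophotometer; model quality is then judged by the CIELAB difference between predicted and measured values.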
Current color-difference formulas were developed using 2D samples, and there is no standard guidance for the color-difference evaluation of 3D objects. The aim of this study was to test and optimize the CIELAB and CIEDE2000 color-difference formulas using 42 pairs of 3D-printed spherical samples in Experiment I and 40 sample pairs in Experiment II. Fifteen human observers with normal color vision assessed the color differences of the 82 pairs of 3D spherical samples under simulated D65 illumination using the gray-scale method. The performance of the CIELAB and CIEDE2000 formulas was quantified by the STRESS index and an F-test against the collected visual results, and three different optimization methods were applied to the original color-difference formulas using the data from the 42 sample pairs in Experiment I. The optimum parametric factors were found to be kL = 1.4 and kC = 1.9 for CIELAB, and kL = 1.5 for CIEDE2000. The visual data of the 40 sample pairs in Experiment II were used to test the optimized formulas: the STRESS values for CIELAB/CIEDE2000 were 32.8/32.9 with the original formulas and 25.3/25.4 with the optimized formulas. The F-test results indicated that the proposed optimization of the parametric factors produced a significant improvement for both color-difference formulas applied to 3D-printed spherical samples.
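Two of the quantities above have compact standard definitions: the STRESS index (García et al.'s formulation, where 0 means perfect agreement between computed and visual differences) and the parametric CIELAB formula with kL/kC weighting. A minimal NumPy sketch (the study's actual optimization over kL, kC is not shown here):

```python
import numpy as np

def stress(dE, dV):
    """STRESS index between computed color differences dE and visual
    differences dV. Lower is better; 0 means perfect proportionality."""
    dE = np.asarray(dE, dtype=float)
    dV = np.asarray(dV, dtype=float)
    F1 = np.sum(dE**2) / np.sum(dE * dV)          # scaling factor
    return 100.0 * np.sqrt(np.sum((dE - F1 * dV)**2)
                           / np.sum((F1 * dV)**2))

def cielab_de(lab1, lab2, kL=1.0, kC=1.0, kH=1.0):
    """Parametric CIELAB color difference with weighting factors
    kL, kC, kH (all 1.0 recovers the plain Euclidean ΔE*ab)."""
    L1, a1, b1 = lab1
    L2, a2, b2 = lab2
    dL = L1 - L2
    C1, C2 = np.hypot(a1, b1), np.hypot(a2, b2)
    dC = C1 - C2
    # ΔH² is defined via Δa² + Δb² − ΔC² (clamped for numerical safety)
    dH2 = max((a1 - a2)**2 + (b1 - b2)**2 - dC**2, 0.0)
    return np.sqrt((dL / kL)**2 + (dC / kC)**2 + dH2 / kH**2)
```

With kL = 1.4 and kC = 1.9 (the optimum found in Experiment I), lightness and chroma differences are down-weighted relative to hue, reflecting the reduced visual weight of those components on the 3D spherical samples.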
An improved spectral reflectance estimation method was developed to transform captured RGB images into spectral reflectance. The novelty of the method lies in an iteratively reweighted regularized model combining polynomial expansion signals, developed for spectral reflectance estimation, with a cross-polarized imaging system used to eliminate glare and specular highlights. Two RGB images are captured under two illumination conditions, and the method was tested using ColorChecker charts. The results demonstrate that the proposed method significantly improves both spectral and colorimetric accuracy: compared with the classic regularized least squares (RLS) method, it achieves a 23.8% improvement in mean CIEDE2000 color difference and a 24.6% improvement in RMS error. The proposed method is sufficiently accurate in predicting spectral properties, with performance within an acceptable range, i.e., the typical customer tolerance of less than 3 ΔE units in the graphic arts industry.
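The baseline RLS transform and the idea of iterative reweighting can be sketched as follows. This is a generic ridge-regression formulation with an illustrative residual-based weighting scheme of my own devising; it is not the paper's exact reweighting rule, expansion, or regularization.

```python
import numpy as np

def rls_fit(P, R, lam=1e-3):
    """Classic regularized least squares (ridge) fit.
    P: (m, N) expanded camera signals for N training patches.
    R: (w, N) measured reflectances at w wavelengths.
    Returns the (w, m) transform M such that R ≈ M @ P."""
    m = P.shape[0]
    return R @ P.T @ np.linalg.inv(P @ P.T + lam * np.eye(m))

def irls_fit(P, R, lam=1e-3, iters=5, eps=1e-6):
    """Iteratively reweighted variant: patches with larger spectral
    residuals are down-weighted on each pass (illustrative scheme)."""
    w = np.ones(P.shape[1])
    for _ in range(iters):
        # Scale each training column by its current weight and refit.
        M = rls_fit(P * w, R * w, lam)
        resid = np.sqrt(np.mean((R - M @ P) ** 2, axis=0))  # per-patch RMS
        w = 1.0 / (resid + eps)
        w /= w.max()  # normalize so the best-fit patch has weight 1
    return M
```

At application time, the expanded signals from the two cross-polarized captures would be stacked into `P` before fitting, and `M @ P` applied to new pixels yields the estimated reflectance spectra, from which CIEDE2000 and RMS errors are computed against measurements.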
Fast track article for IS&T International Symposium on Electronic Imaging 2021: Color Imaging XXVI: Displaying, Processing, Hardcopy, and Applications proceedings.