In this paper, we investigate feature extraction, feature selection, and classification methods for an automatic facial expression recognition (FER) system. The FER system is fully automatic and consists of the following modules: face detection, extraction of the frame with the maximum intensity of emotion, feature extraction, selection of optimal features, and classification. Face detection is based on the AdaBoost algorithm and is followed by the extraction of the frame with the maximum intensity of emotion using the inter-frame mutual information criterion. The selected frames are then processed to generate characteristic features using different methods, including Gabor filters, log-Gabor filters, the local binary pattern (LBP) operator, higher-order local autocorrelation (HLAC), and a recently proposed method called HLAC-like features (HLACLF). The most informative features are selected using both wrapper and filter feature selection methods. Experiments on several facial expression databases compare the performance of these methods.
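To illustrate one of the texture descriptors listed above, the basic 8-neighbour LBP operator can be sketched in Python/NumPy as follows. This is a minimal illustrative sketch, not the authors' implementation; the function names and the histogram feature vector are our own choices.

```python
import numpy as np

def lbp_8(image):
    """Basic 8-neighbour local binary pattern (LBP) code for each
    interior pixel of a 2-D grayscale image. Each neighbour is
    compared to the centre pixel; neighbours >= the centre set a bit
    in an 8-bit code."""
    img = np.asarray(image, dtype=np.int32)
    center = img[1:-1, 1:-1]
    # 8 neighbours in clockwise order starting at the top-left pixel.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(center)
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = img[1 + dy: img.shape[0] - 1 + dy,
                        1 + dx: img.shape[1] - 1 + dx]
        codes |= (neighbour >= center).astype(np.int32) << bit
    return codes

def lbp_histogram(image, bins=256):
    """Normalised histogram of LBP codes, a common texture feature."""
    hist, _ = np.histogram(lbp_8(image), bins=bins, range=(0, bins))
    return hist / hist.sum()
```

In practice the histogram is usually computed over a grid of image regions and the per-region histograms are concatenated, so that the feature vector retains spatial information about the face.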
This paper introduces a tensor perceptual color framework (TPCF) for facial expression recognition (FER) based on the information contained in color facial images. The TPCF enables multi-linear image analysis in different color spaces and demonstrates that color components provide additional information for robust FER. Using this framework, the components of color images (in the RGB, YCbCr, CIELab, or CIELuv space) are unfolded into two-dimensional (2-D) tensors based on multi-linear algebra and tensor concepts, from which features are extracted by Log-Gabor filters. The mutual information quotient (MIQ) method is employed for feature selection, and the selected features are classified using a multi-class linear discriminant analysis (LDA) classifier. Performance is evaluated on low-resolution facial expression images and on images with illumination variations to assess the effectiveness of color information for FER. Experimental results demonstrate that color information has significant potential to improve emotion recognition performance owing to the complementary characteristics of image textures. Furthermore, the perceptual color spaces (CIELab and CIELuv) outperform the other color spaces, providing more efficient and robust FER performance on facial images with illumination variation.
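The unfolding step can be illustrated with the standard mode-n unfolding from multi-linear algebra, which rearranges one mode of a tensor into the rows of a 2-D matrix. The NumPy sketch below is illustrative only; the paper's exact ordering convention for the unfolded fibres is not specified here, and the row-major ordering used is an assumption.

```python
import numpy as np

def unfold(tensor, mode):
    """Mode-n unfolding: move the chosen mode to the front and
    flatten the remaining modes, giving a matrix of shape
    (tensor.shape[mode], prod(other dims))."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

# Example: an H x W x 3 color image unfolded along the channel mode
# yields a 3 x (H*W) matrix, one row per color component.
image = np.arange(4 * 5 * 3).reshape(4, 5, 3)
channels = unfold(image, 2)   # shape (3, 20)
```

Unfolding along the channel mode places each color component (e.g. L, a, b) in its own row, so that 2-D filtering such as Log-Gabor feature extraction can be applied per component.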
This study proposes an automatic dorsal hand vein verification system using a novel algorithm called biometric graph matching (BGM). The dorsal hand vein image is segmented using the K-means technique, the region of interest is extracted using morphological operators, and the image is normalised using adaptive histogram equalisation. Veins are extracted using a maximum curvature algorithm. The locations of, and vascular connections between, crossovers, bifurcations, and terminations in a hand vein pattern define a hand vein graph. The matching performance of BGM on hand vein graphs is tested with two cost functions and compared with that of two standard point pattern matching algorithms, iterative closest point (ICP) and the modified Hausdorff distance. Experiments are conducted on two public databases captured with far infrared and near infrared (NIR) cameras. BGM's matching performance is competitive with state-of-the-art algorithms on both databases despite using small, concise templates. For both databases, BGM performed at least as well as ICP, and for the small graphs from the NIR database, BGM significantly outperformed point pattern matching. The size of the common subgraph of a pair of graphs is the most significant measure discriminating genuine from impostor comparisons.
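One of the baseline point pattern matchers mentioned above, the modified Hausdorff distance, can be sketched as follows using Dubuisson and Jain's standard formulation (mean of nearest-neighbour distances in each direction, then the maximum of the two). This is an illustrative sketch, not the study's code.

```python
import numpy as np

def modified_hausdorff(A, B):
    """Modified Hausdorff distance between two 2-D point sets,
    given as (n, d) and (m, d) arrays.

    d(A, B) = mean over a in A of min over b in B of ||a - b||
    MHD(A, B) = max(d(A, B), d(B, A))
    """
    A = np.asarray(A, dtype=float)
    B = np.asarray(B, dtype=float)
    # Pairwise Euclidean distance matrix, shape (n, m).
    dists = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
    d_ab = dists.min(axis=1).mean()  # A -> B direction
    d_ba = dists.min(axis=0).mean()  # B -> A direction
    return max(d_ab, d_ba)
```

For vein matching, A and B would be the feature-point locations (crossovers, bifurcations, terminations) of two templates; unlike BGM, this measure ignores the vascular connections between the points.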