In recent years, statistically motivated approaches for the registration and tracking of non-rigid objects, such as the Active Appearance Model (AAM), have become very popular. A major drawback of these approaches is that they require manual annotation of all training images, which can be tedious and error-prone. In this paper, an MPEG-4 based approach is presented for the automatic annotation of frontal face images with arbitrary facial expressions, starting from a single annotated frontal image. The approach uses the MPEG-4 facial animation system to generate virtual images with different expressions and applies the existing AAM framework to automatically annotate unseen images. It demonstrates excellent generalisability by automatically annotating face images from two different databases.
A neuro-physiologically inspired model is presented for the contrast enhancement of images. The contrast of an image is computed using simulated on- and off-centre receptive fields, yielding two corresponding contrast maps. We propose an adaptive asymmetric gain control function that is applied to the two contrast maps, which are then used to reconstruct the image, resulting in its contrast enhancement. The image's mean luminance can be adjusted as desired by varying the asymmetry between the gain control factors of the two maps. The model performs local contrast enhancement in the contrast domain of the image, where it lends itself very naturally to such adjustments. Furthermore, the model is extended to colour images using the concept of colour-opponent receptive fields found in the human visual system. The colour model enhances contrast directly in the colour space, without extracting luminance information from it. Being neurophysiologically plausible, the model can also be beneficial in theorising about and understanding the gain control mechanisms in the primate visual system. We compare our results with the CLAHE algorithm.
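The on-/off-centre contrast-map pipeline described above can be sketched using difference-of-Gaussians (DoG) filtering, a standard computational model of centre-surround receptive fields. The sketch below is a minimal illustration under assumed parameters: the function name `enhance_contrast`, the sigma values, and the gain factors are illustrative choices, not the paper's actual model or tuning.

```python
# Illustrative sketch of centre-surround contrast enhancement.
# Assumptions (not from the paper): DoG approximates the receptive fields,
# and a simple per-map multiplicative gain stands in for the adaptive
# asymmetric gain control function.
import numpy as np
from scipy.ndimage import gaussian_filter

def enhance_contrast(img, sigma_centre=1.0, sigma_surround=3.0,
                     gain_on=1.5, gain_off=1.2):
    img = img.astype(float)
    centre = gaussian_filter(img, sigma_centre)
    surround = gaussian_filter(img, sigma_surround)
    dog = centre - surround              # signed local contrast
    on_map = np.maximum(dog, 0.0)        # on-centre responses
    off_map = np.maximum(-dog, 0.0)      # off-centre responses
    # Unequal gain_on / gain_off shifts the mean luminance of the result,
    # mirroring the asymmetry described in the abstract.
    enhanced = surround + gain_on * on_map - gain_off * off_map
    return np.clip(enhanced, 0.0, 255.0)
```

On a uniform image the two contrast maps are zero and the input is returned unchanged; near edges the on and off maps are amplified by their respective gains, steepening local transitions.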