2019
DOI: 10.1109/access.2018.2889852
Facial Action Units-Based Image Retrieval for Facial Expression Recognition

Abstract: Facial expression recognition (FER) is a very challenging problem in computer vision. Although extensive research has been conducted to improve FER performance in recent years, there is still room for improvement. A common goal of FER is to classify a given face image into one of seven emotion categories: angry, disgust, fear, happy, neutral, sad, and surprise. In this paper, we propose to use a simple multi-layer perceptron (MLP) classifier that determines whether the current classification result is reliable…

Cited by 33 publications (23 citation statements)
References 47 publications
“…All of these selected AUs are part of the standard set used in most facial expression recognition (FER) systems based on FACS [44,45]. We excluded AU41 (lid droop), AU42 (slit), and AU46 (wink) from the upper part of the face because these AUs were not coded with intensity levels in the last FACS revision [43], and a binary classification (AU present/absent) in a context where all other AUs were evaluated on an intensity level from A to E could negatively impact the accuracy of the system.…”
Section: Methods
confidence: 99%
“…Therefore, we used 31 AUs that could be classified quickly and with high accuracy. For the remaining AUs, more complex methods would have been required, which would also have required more processing time, so they were not evaluated in this study, since we aimed to build a system able to predict the DASS levels in real time. Another reason we selected this set of AUs is that they are present in several large databases employed for testing FER systems [44,45], such as the extended Cohn-Kanade (CK+) and MMI databases, which we used in this study to evaluate the accuracy of the AU classification task. All AUs analyzed were treated individually, as nonadditive.…”
Section: Methods
confidence: 99%
“…In [37], the AU-based image retrieval method was used to obtain supplementary information on the classification result of the given test image. Specifically, the retrieved image as well as the original test image participate in making the final classification.…”
Section: Related Work
confidence: 99%
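One plausible way the retrieved image can "participate" in the final classification is score-level fusion: average the classifier's class probabilities for the test image and its AU-retrieved neighbor. The sketch below is an illustrative assumption, not the cited paper's exact decision rule; `fused_prediction`, the weight `alpha`, and the toy probability vectors are all hypothetical.

```python
import numpy as np

def fused_prediction(probs_query, probs_retrieved, alpha=0.5):
    """Fuse the softmax scores of the test image with those of its
    AU-retrieved neighbor by a weighted average (alpha = query weight)."""
    return alpha * probs_query + (1 - alpha) * probs_retrieved

# Toy scores over three emotion classes (hypothetical values).
p_test = np.array([0.30, 0.45, 0.25])   # classifier output on the test image
p_ret  = np.array([0.10, 0.80, 0.10])   # classifier output on the retrieved image
fused = fused_prediction(p_test, p_ret)
print(int(np.argmax(fused)))            # → 1 (fusion reinforces class 1)
```

A retrieved image with a confident, agreeing prediction sharpens the fused score; a disagreeing one flags the original classification as less reliable.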
“…In this sub-section, we introduce an AU-based image retrieval method that finds images with a similar emotion, rather than a similar face, to increase the number of images in the minority classes. AU-based image retrieval has already been used in [37], where it was applied in the testing stage to provide additional test images and increase the reliability of the classification score. In this paper, however, AU-based image retrieval is adopted in the training stage to find images whose emotion is similar to that of the query image.…”
Section: A. Image Oversampling by AU-Based Retrieval
confidence: 99%
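The retrieval step described above can be sketched as a nearest-neighbor search over AU intensity vectors: images whose action-unit activations are close are treated as expressing a similar emotion, regardless of identity. This is a minimal sketch assuming AU intensities are already available as fixed-length vectors (e.g. from an AU detector); `au_retrieve`, the Euclidean metric, and the toy data are illustrative assumptions, not the paper's exact method.

```python
import numpy as np

def au_retrieve(query_aus, gallery_aus, k=3):
    """Return indices of the k gallery images whose AU intensity
    vectors are closest (Euclidean distance) to the query's vector."""
    dists = np.linalg.norm(gallery_aus - query_aus, axis=1)
    return np.argsort(dists)[:k]

# Toy gallery: 4 images, each described by 5 AU intensities in [0, 1].
gallery = np.array([
    [0.90, 0.10, 0.00, 0.80, 0.20],   # AU pattern resembling the query
    [0.10, 0.90, 0.70, 0.00, 0.10],
    [0.70, 0.30, 0.20, 0.60, 0.35],   # moderately similar pattern
    [0.00, 0.80, 0.90, 0.10, 0.00],
])
query = np.array([0.85, 0.15, 0.05, 0.75, 0.25])
print(au_retrieve(query, gallery, k=2))   # → [0 2]
```

For oversampling, the retrieved neighbors of minority-class images can simply be appended to the training set, growing the minority classes with emotionally consistent samples rather than duplicates.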