2020
DOI: 10.1371/journal.pone.0231304

Tsinghua facial expression database – A database of facial expressions in Chinese young and older women and men: Development and validation

Abstract: Perception of facial identity and emotional expressions is fundamental to social interactions. Recently, interest in age-associated changes in the processing of faces has grown rapidly. Due to the lack of older-face stimuli, most previous age-comparative studies used only young-face stimuli, which might cause an own-age advantage. None of the existing Eastern face stimulus databases contains face images of different age groups (e.g. older adult faces). In this study, a database that comprises images of 110 Chinese…

Cited by 61 publications (40 citation statements)
References 46 publications

“…Previous studies have demonstrated that facial recognition is sensitive to differences in race (Ekman and Friesen, 1971; Ekman et al., 1987; Ng and Lindsay, 1994); therefore, to exclude cross-race influences on facial expression recognition, facial emotional expression images were selected from the Tsinghua facial expression database, a database of facial expressions posed by young and older Chinese women and men (Yang et al., 2020). In this study, we selected happy, neutral, and sad expressions with correct categorization rates of 97.77%, 84.97%, and 76.41%, respectively (Yang et al., 2020). Since the algorithm (Kim et al., 2019) used for real-to-cartoon face conversion in this study was developed using female images, all the facial expression images selected from the database were of young female actors (20 in total).…”
Section: Stimuli
confidence: 99%
“…All stimuli can be seen on the OSF (https://osf.io/xsfg8/). Original images were drawn from the Tsinghua facial expression database (Yang et al., 2020) with permission.…”
Section: Introduction
confidence: 99%
“…Similar criteria were used by other dataset creators (e.g. Yang et al., 2020; Ebner, Riediger, & Lindenberger, 2010). In the second phase of ratings, the additional dimensions of attractiveness, valence, and genuineness were added to the survey, which was then completed by 11 raters who each rated all 248 remaining images.…”
Section: Database Validation
confidence: 99%
“…While there is agreement that facial movements are informative for inferring emotions, the existence of a precise mapping between configurations of facial movements and emotion categories that generalizes across all cultures remains strongly debated (Nelson & Russell, 2013; Duran & Fernández-Dols, 2018; Barrett et al., 2019). Most of the existing emotional face stimulus sets were developed in Western societies and a few countries in East Asia (the EU-emotion stimulus set (O'Reilly et al., 2016), the Tsinghua facial expression database (Yang et al., 2020),…”
Section: Introduction
confidence: 99%