A Handbook of Process Tracing Methods (2019)
DOI: 10.4324/9781315160559-14
A Practical Guide for Automated Facial Emotion Classification

Cited by 6 publications (4 citation statements); references 1 publication.
“…This software uses algorithms to translate the movement of facial features, such as eyes, eye corners, brows, mouth corners, and nose tip, into classifications of emotional valence. Recent work suggests that this automated facial-expression analysis software performs well for detecting emotional states [ 52 , 53 ]. We inspected the aggregated data for the number of occurrences across all respondents, and for any positive, negative, or neutral emotional valence elicited by the visualization.…”
Section: Methods
confidence: 99%
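The aggregation step these authors describe, counting valence classifications across all respondents, is straightforward to reproduce. Below is a minimal Python sketch, assuming the analysis software exports one categorical valence label per frame per respondent; the data structure and respondent IDs are hypothetical, not the format of any particular tool.

```python
from collections import Counter

# Hypothetical per-frame valence labels exported by the analysis software,
# keyed by respondent ID. Labels and structure are illustrative only.
frames_by_respondent = {
    "r01": ["positive", "neutral", "neutral", "positive"],
    "r02": ["negative", "neutral", "positive"],
    "r03": ["neutral", "neutral", "negative", "negative"],
}

# Count occurrences of each valence class across all respondents.
totals = Counter()
for labels in frames_by_respondent.values():
    totals.update(labels)

for valence in ("positive", "negative", "neutral"):
    share = totals[valence] / sum(totals.values())
    print(f"{valence:>8}: {totals[valence]} frames ({share:.0%})")
```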
“…This software uses algorithms to translate the movement of facial features, such as the eyes, eye corners, brows, mouth corners, and nose tip, into classifications of emotional valence. Recent work suggests that this automated facial-expression analysis software performs well for detecting emotional states [48,49]. We inspected the aggregated data for the number of occurrences across all respondents, and for any positive, negative, or neutral emotional valence elicited by the visualization.…”
Section: Discussion
confidence: 99%
“…Instead, it employs a neural network to analyze patterns of wrinkles and crevices created by the different action units, an approach described as superior to facial-point-based architectures (iMotions, 2018). Consequently, Emotient FACET provides enhanced accuracy compared to human coders and other algorithms (Stockli et al., 2019).…”
Section: Methods
confidence: 99%
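For intuition about action-unit-based classification, as opposed to landmark-point tracking, here is a toy Python sketch that scores per-frame AU activations against EMFACS-style emotion prototypes. The AU_PROTOTYPES table, classify_frame helper, and threshold are all illustrative assumptions; FACET's actual neural-network pipeline is proprietary and not what this code implements.

```python
# Toy illustration of action-unit-based emotion classification. The
# prototype table, threshold, and classify_frame helper are hypothetical.
AU_PROTOTYPES = {
    "happiness": {6, 12},        # cheek raiser + lip corner puller
    "sadness":   {1, 4, 15},     # inner brow raiser, brow lowerer, lip corner depressor
    "surprise":  {1, 2, 5, 26},  # brow raisers, upper lid raiser, jaw drop
}

def classify_frame(au_scores, threshold=0.5):
    """Return the prototype emotion best matched by the active AUs."""
    active = {au for au, score in au_scores.items() if score >= threshold}
    emotion, prototype = max(
        AU_PROTOTYPES.items(),
        key=lambda item: len(item[1] & active) / len(item[1]),
    )
    # Report "neutral" when fewer than half of the prototype AUs are active.
    if len(prototype & active) / len(prototype) < 0.5:
        return "neutral"
    return emotion

print(classify_frame({6: 0.8, 12: 0.9, 4: 0.1}))  # -> happiness
```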
“…These databases contain validated pictures in which trained subjects were photographed displaying facial expressions of certain emotions. Emotient FACET was found to identify facial expressions of emotions from pictures in these datasets with 96% accuracy, superior to other popular algorithms such as FaceReader (88%) and Affdex (68%; Stockli et al., 2019). Additional validation studies have highlighted the ability of FACET to correctly classify emotions, illustrating strong agreement between FACET and human raters (e.g., Calvo et al., 2018; Krumhuber et al., 2020).…”
confidence: 88%
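The comparisons in this excerpt reduce to two standard statistics: raw classification accuracy over validated picture sets, and chance-corrected agreement between the algorithm and human raters, commonly Cohen's kappa. A minimal sketch of both computations, using scikit-learn and made-up labels rather than the cited studies' data:

```python
from sklearn.metrics import accuracy_score, cohen_kappa_score

# Hypothetical labels for ten validation pictures; in the cited studies the
# ground truth comes from databases of posed, validated expressions.
human = ["joy", "anger", "fear", "joy", "sadness", "joy", "anger", "fear", "joy", "sadness"]
facet = ["joy", "anger", "fear", "joy", "sadness", "joy", "anger", "joy", "joy", "sadness"]

print(f"accuracy: {accuracy_score(human, facet):.0%}")     # raw hit rate
print(f"kappa:    {cohen_kappa_score(human, facet):.2f}")  # chance-corrected agreement
```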