Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction Extended Abstracts 2015
DOI: 10.1145/2701973.2702053
Facial Expression Synthesis on Robots

Abstract: We present a generalized technique for easily synthesizing facial expressions on robotic faces. In contrast to other work, our approach works in near real time with a high level of accuracy, does not require any manual labeling, is a fully open-source ROS module, and can enable the research community to perform objective and systematic comparisons between the expressive capabilities of different robots.
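A common way to realize such synthesis is to map a recognized expression onto facial action units (AUs) and then onto the robot's actuators. The following is an illustrative sketch of that idea only, not the paper's actual ROS module; all mappings, names, and intensity values here are hypothetical.

```python
# Hypothetical sketch: map an emotion label to FACS action units (AUs),
# then to normalized servo positions on a robot face.

# Prototypical AU activations per basic emotion (illustrative values in [0, 1]).
EMOTION_TO_AUS = {
    "happiness": {"AU6": 0.8, "AU12": 1.0},  # cheek raiser, lip corner puller
    "surprise":  {"AU1": 0.9, "AU2": 0.9, "AU5": 0.7, "AU26": 0.8},
    "sadness":   {"AU1": 0.6, "AU4": 0.5, "AU15": 0.8},
}

# Which servo drives each AU on a hypothetical robot head.
AU_TO_SERVO = {
    "AU1": "brow_inner", "AU2": "brow_outer", "AU4": "brow_lower",
    "AU5": "lid_upper", "AU6": "cheek", "AU12": "lip_corner",
    "AU15": "lip_depressor", "AU26": "jaw",
}

def synthesize(emotion, intensity=1.0):
    """Return normalized servo targets in [0, 1] for an emotion label."""
    aus = EMOTION_TO_AUS[emotion]
    return {AU_TO_SERVO[au]: min(1.0, level * intensity)
            for au, level in aus.items()}

targets = synthesize("happiness", intensity=0.5)
```

In an actual ROS deployment, a node would publish such targets as joint commands; the dictionary form above simply makes the AU-to-actuator indirection explicit.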

Cited by 4 publications (6 citation statements)
References 6 publications
“…Interestingly, the human accuracy was roughly 65% on FER-2013, according to Goodfellow et al (2013) , which is about 9% lower than RMN. Due to various factors such as location, perceived gender, and age affecting people’s subjective judgment ( Moosaei and Riek, 2013 ), using an ML-based expression classifier may provide comparable accuracies while being more efficient in terms of time and resources. People should therefore be included in the evaluation stage when the robot is sufficiently developed and can be applied in the context it was created for.…”
Section: Discussion
confidence: 99%
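The comparison drawn in the statement above amounts to scoring predicted labels against reference labels. A minimal sketch of that evaluation, using toy data rather than FER-2013:

```python
# Minimal sketch: score an expression classifier's predictions against
# reference labels, the same accuracy measure used to compare an ML
# classifier with reported human accuracy (~65% on FER-2013).
# The label lists below are toy data, not real annotations.

def accuracy(predicted, reference):
    """Fraction of predictions that match the reference labels."""
    assert len(predicted) == len(reference)
    correct = sum(p == r for p, r in zip(predicted, reference))
    return correct / len(reference)

human_votes = ["happy", "sad", "angry", "happy", "neutral"]
model_preds = ["happy", "sad", "happy", "happy", "neutral"]
acc = accuracy(model_preds, human_votes)  # 4 of 5 correct
```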
“…The development has been highly iterative, and thus several prototypes have been built to answer technical questions, as well as being a manifestation of the idea that can be presented to users to gain important feedback ( Auflem et al, 2019 ). However, the challenge of the subjective matter of facial cues and expressions is obtaining actionable and objective data to measure the performance of prototypes by multimodal evaluation ( Moosaei and Riek, 2013 ; Ege et al, 2020 ). This is particularly challenging when evaluating expressive robots due to the resolution and fidelity of the presented prototype being perceived differently, especially when users are unaware of the current state of development.…”
Section: Introduction
confidence: 99%
“…The human face is a key expressive modality for communicating with others and understanding their intentions and expressions. Facial expressions are a form of visual communication that help to enhance other modalities of communication, such as spoken or gestural language, and enable people to spontaneously communicate important information [60,133]. In clinical settings, healthcare workers use other non-verbal cues to infer patient physiological states, such as pallor, blinking, eye gaze, blushing, and sweating.…”
Section: The Face As a Communication Modality For Robots And Virtual ...
confidence: 99%
“…The contributions of this work are as follows. First, we extended an FEA system previously developed by our team [133] to improve automatic FACS ratings of facial AUs. The extended FEA system benefits from preprocessing techniques such as noise reduction and facial alignment techniques to diminish the effects of facial deformations, including translation, rotation, and distance to the camera.…”
Section: End-to-end Analysis-modeling-synthesis Framework Development
confidence: 99%
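The facial-alignment step mentioned in the statement above is commonly implemented as a similarity transform that moves detected eye landmarks to canonical positions, cancelling in-plane rotation and scale (distance to camera). The sketch below shows that generic technique under stated assumptions; it is not the extended FEA system's actual code, and the canonical eye coordinates are illustrative.

```python
import numpy as np

# Illustrative sketch of one preprocessing step: align a face image by the
# similarity transform (rotation + uniform scale + translation) that carries
# the two detected eye centers to canonical positions in a unit image.

def alignment_transform(left_eye, right_eye,
                        target_left=(0.35, 0.4), target_right=(0.65, 0.4)):
    """Return a 2x3 affine matrix mapping detected eye points to canonical ones."""
    src = np.array([left_eye, right_eye], dtype=float)
    dst = np.array([target_left, target_right], dtype=float)
    d_src, d_dst = src[1] - src[0], dst[1] - dst[0]
    # Scale and rotation that take the source eye vector onto the target one.
    scale = np.hypot(*d_dst) / np.hypot(*d_src)
    angle = np.arctan2(d_dst[1], d_dst[0]) - np.arctan2(d_src[1], d_src[0])
    c, s = scale * np.cos(angle), scale * np.sin(angle)
    R = np.array([[c, -s], [s, c]])
    t = dst[0] - R @ src[0]           # translation fixing the left eye
    return np.hstack([R, t[:, None]])  # 2x3 affine matrix

M = alignment_transform((0.3, 0.5), (0.7, 0.5))
aligned_left = M @ np.array([0.3, 0.5, 1.0])  # lands at the canonical left eye
```

Applying `M` (e.g. via a warp in an image library) normalizes translation, rotation, and apparent face size before AU analysis, which is the stated purpose of the alignment step.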