Robot Vision
DOI: 10.1007/978-3-540-78157-8_11
Facial Expression Recognition for Human-Robot Interaction – A Prototype

Cited by 22 publications (23 citation statements)
References 24 publications
“…A number of HRI studies have incorporated the use of affect classification using categorical emotional models for facial expressions [57][58][59][60], body language [61][62][63][64], voice [7,19,[65][66][67][68][69], physiological signals [70][71][72], and multi-modal systems [73,74]. These models allow robots to interpret affective states in a similar manner as humans [75].…”
Section: Affect Models Used in HRI
confidence: 99%
“…For example, it could provide the user's intentions and feelings to a machine or robot, enabling it to respond more appropriately during the service (Essa & Pentland, 1997). In robot-assisted learning, where a robot teaches the user by explaining the content of a lesson and questioning the user afterwards, understanding the human's emotion enables the robot to progress from one lesson to the next when the user is ready (Wimmer et al., 2008). Video conferencing, tele-presence and tele-teaching require the transmission of large amounts of data, and data compression is often needed to reduce storage and bandwidth requirements.…”
Section: Applications
confidence: 99%
“…The extracted features should represent the different types of facial expressions in a way that is not significantly affected by the age, gender, or ethnic origin of the subject. The classification method must be capable of defining appropriate rules to derive a specific type of facial expression from the facial features provided, even when the output from the preceding processing stages, such as facial data acquisition and facial feature extraction, is noisy or incomplete (Wimmer et al., 2008).…”
Section: Challenges
confidence: 99%
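The excerpt above asks for a classifier that still produces a sensible label when the feature extraction stage delivers noisy or incomplete output. A minimal sketch of one way to meet that requirement is a nearest-centroid classifier that simply ignores missing (NaN) features; the expression labels and centroid values here are illustrative assumptions, not taken from the cited chapter:

```python
import numpy as np

# Hypothetical per-expression "centroid" feature vectors (e.g. geometric
# distances between facial landmarks). Values are illustrative only.
CENTROIDS = {
    "happy":   np.array([0.9, 0.2, 0.7]),
    "sad":     np.array([0.3, 0.8, 0.2]),
    "neutral": np.array([0.5, 0.5, 0.5]),
}

def classify(features):
    """Nearest-centroid classification that tolerates missing (NaN)
    features, as the preceding excerpt requires of the classifier."""
    f = np.asarray(features, dtype=float)
    valid = ~np.isnan(f)  # ignore features the extractor failed to provide
    best_label, best_dist = None, np.inf
    for label, centroid in CENTROIDS.items():
        d = np.linalg.norm(f[valid] - centroid[valid])
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label
```

Because the distance is computed only over the valid entries, a face whose mouth landmarks were lost to occlusion is still compared to every class on the features that did survive.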
“…However, almost all methods for HRI are developed for controlling robots, not for interacting with them. For natural communications between robots and humans, it is necessary for the robot to be able to respond according to the user's emotional state [1]. In general, the user may feel more satisfied with and friendly towards the robot when it responds in parallel with the user's emotion.…”
Section: Introduction
confidence: 99%
“…In general, the user may feel more satisfied with and friendly towards the robot when it responds in parallel with the user's emotion. Facial expression is one of the most important means of conveying emotion in interaction [1,2,3]. To recognize the facial expression of an input face, the robot must obtain an accurate reading of the positions of facial feature points, such as the nose, mouth, eyes and eyebrows, since the locations of these points vary greatly with the facial expression.…”
Section: Introduction
confidence: 99%
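Since feature point locations shift with both expression and face scale, a common follow-up step is to turn raw landmark coordinates into scale-invariant geometric features. A small sketch, with hypothetical landmark coordinates standing in for the output of an actual feature point detector:

```python
import math

# Hypothetical landmark coordinates (x, y) in image pixels; in a real
# system these would come from a facial feature point detector.
landmarks = {
    "left_eye": (120, 100), "right_eye": (180, 100),
    "nose_tip": (150, 140),
    "mouth_top": (150, 165), "mouth_bottom": (150, 185),
    "left_brow": (120, 85), "right_brow": (180, 85),
}

def dist(a, b):
    """Euclidean distance between two (x, y) points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def expression_features(lm):
    """Scale-invariant geometric features: landmark distances divided by
    the inter-ocular distance, so values do not depend on how close the
    face is to the camera."""
    iod = dist(lm["left_eye"], lm["right_eye"])  # inter-ocular distance
    return {
        "mouth_opening": dist(lm["mouth_top"], lm["mouth_bottom"]) / iod,
        "brow_raise": dist(lm["left_brow"], lm["left_eye"]) / iod,
    }
```

Normalizing by the inter-ocular distance is one simple choice; any stable reference length on the face would serve the same purpose.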