2013 IEEE International Conference on Robotics and Automation
DOI: 10.1109/icra.2013.6630572
Facial communicative signal interpretation in human-robot interaction by discriminative video subsequence selection

Abstract: Facial communicative signals (FCSs) such as head gestures, eye gaze, and facial expressions can provide useful feedback in conversations between people and also in human-robot interaction. This paper presents a pattern recognition approach for the interpretation of FCSs in terms of valence, based on the selection of discriminative subsequences in video data. These subsequences capture important temporal dynamics and are used as prototypical reference subsequences in a classification procedure based on …

Cited by 4 publications (1 citation statement)
References 46 publications
“…Active appearance models individualized for each participant enabled the development of highly discriminant feature vectors from subsegments of each video sequence, which were cross-validated and then tested. Accuracy of the system in recognizing 'success' versus 'failure' displays, despite considerable individual and intertrial variation, equaled average human recognition performance [90].…”
Section: Future Directions
Confidence: 91%