Proceedings of the 6th International Conference on Human-Robot Interaction 2011
DOI: 10.1145/1957656.1957781

Automatic analysis of affective postures and body motion to detect engagement with a game companion

Abstract: The design of an affect recognition system for socially perceptive robots relies on representative data: human-robot interaction in naturalistic settings requires an affect recognition system to be trained and validated with contextualised affective expressions, that is, expressions that emerge in the same interaction scenario as the target application. In this paper we propose an initial computational model to automatically analyse human postures and body motion to detect engagement of children playing chess …
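The abstract describes analysing postures and body motion as engagement cues. As a minimal sketch of what such features can look like, the snippet below computes two cues often used in posture-based affect work: body lean angle and quantity of motion between skeleton frames. The joint names, coordinate convention (metres, y-up), and feature choices are illustrative assumptions, not the paper's actual feature set.

```python
import math

# Hypothetical skeleton frame: joint name -> (x, y, z) in metres, y-up.
# The paper's feature set is richer; this sketch shows two common cues.

def lean_angle(frame):
    """Body lean: angle between the torso vector (hip -> head)
    and the vertical axis, in degrees."""
    hx, hy, hz = frame["head"]
    px, py, pz = frame["hip"]
    dx, dy, dz = hx - px, hy - py, hz - pz
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    # cos(theta) = vertical component / vector length
    return math.degrees(math.acos(dy / norm))

def quantity_of_motion(prev, curr):
    """Mean joint displacement between two consecutive frames."""
    joints = prev.keys()
    total = sum(math.dist(prev[j], curr[j]) for j in joints)
    return total / len(joints)

# Toy frames: the child leans slightly forward between frames.
frame_a = {"head": (0.0, 1.6, 0.0), "hip": (0.0, 1.0, 0.0)}
frame_b = {"head": (0.05, 1.6, 0.1), "hip": (0.0, 1.0, 0.0)}

print(lean_angle(frame_b))
print(quantity_of_motion(frame_a, frame_b))
```

Features like these would then be fed to a classifier trained on annotated engagement labels; any thresholds or model choices are downstream decisions not shown here.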

Cited by 204 publications
(136 citation statements)
References 31 publications
“…If correct, this explanation underscores the challenges inherent in scaling and generalizing sensor-based affect detectors. One direction for addressing these sources of variation in the future is to investigate alternate positioning for the Kinect sensors, including use of side angles (Sanghvi et al 2011). Additional directions include engineering predictor features using standard units rather than Cartesian space, as well as features that account for individual differences in body shape and size (Worsley et al 2015).…”
Section: Discussion of Initial Affect Detector Modeling Results
confidence: 99%
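The excerpt above suggests engineering features that account for individual differences in body shape and size. One plausible way to realise that idea, sketched below under assumed joint names, is to express distances in body-relative units (here, torso lengths) rather than raw Cartesian metres, so that the same relative pose yields the same feature for children of different sizes. This is an illustration of the idea, not the cited authors' exact method.

```python
import math

# Hypothetical frames: joint name -> (x, y, z) in metres.

def torso_length(frame):
    """Per-person size reference: neck-to-hip distance."""
    return math.dist(frame["neck"], frame["hip"])

def normalized_hand_distance(frame):
    """Hand-to-hand distance in torso lengths instead of metres."""
    return math.dist(frame["left_hand"], frame["right_hand"]) / torso_length(frame)

# Two children in the same relative pose, at different body scales.
small = {"neck": (0.0, 1.2, 0.0), "hip": (0.0, 0.8, 0.0),
         "left_hand": (-0.2, 1.0, 0.0), "right_hand": (0.2, 1.0, 0.0)}
tall = {"neck": (0.0, 1.8, 0.0), "hip": (0.0, 1.2, 0.0),
        "left_hand": (-0.3, 1.5, 0.0), "right_hand": (0.3, 1.5, 0.0)}

# Raw distances differ (0.4 m vs 0.6 m), but the normalized
# feature is identical, making the detector less size-dependent.
print(normalized_hand_distance(small))
print(normalized_hand_distance(tall))
```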
“…For example, Poggi formulates engagement as: "The value that a participant in an interaction attributes to the goal of being together with the other participant(s) and of continuing the interaction" [37]. This definition has been used in several works [6,13,20,35,39]. Based on this definition we added continuous annotations of engagement and then provided discrete instances when a change occurs ranging from strongly disengaged to strongly engaged.…”
Section: Multimodal Signals of Engagement
confidence: 99%
“…a pressure-sensing chair [6], [7], or generic inputs, e.g. video [8] or Kinect [9], [10]. The number of cognitive and affective states being inferred from observable posture continues to increase [9], [10]: frustration, involvement, endurability, engagement or attention.…”
Section: Related Work
confidence: 99%