2015
DOI: 10.3389/fpsyg.2015.00931
Systematic analysis of video data from different human–robot interaction studies: a categorization of social signals during error situations

Abstract: Human–robot interactions are often affected by error situations that are caused by either the robot or the human. Therefore, robots would profit from the ability to recognize when error situations occur. We investigated the verbal and non-verbal social signals that humans show when error situations occur in human–robot interaction experiments. For that, we analyzed 201 videos of five human–robot interaction user studies with varying tasks from four independent projects. The analysis shows that there are two ty…

Cited by 56 publications (53 citation statements)
References 27 publications
“…Therefore, recognizing social signals might help a robot to understand that an error happened. According to the frequencies of occurrence, gaze shifts and smile/laughter carry most potential for error detection, which is in line with our previous findings in the study of Giuliani et al (2015). Upon a detailed analysis on the categories of social signals we found that people make significantly more gaze shifts during technical failures.…”
Section: Discussion (supporting)
confidence: 90%
“…To base the user study on the previous findings from Giuliani et al (2015) and Mirnig et al (2015), we programmed the robot to commit two social norm violations and two technical failures in each session. Based on our previous research, we defined these two types of error as the typical mistakes robots make in HRI.…”
Section: Methods (mentioning)
confidence: 99%
“…Ross (Ross et al., 2004) categorized system errors according to failure recoverability, defining anticipated errors (when the agent backtracks through the plan to achieve the same goal through an alternate course of action), exceptional errors (when the current plan cannot cope with the failure, and re-planning can be done to formulate a strategy to achieve the original goal), unrecoverable errors (when the current plan cannot cope with the error and re-planning cannot be done), and socially recoverable errors (when the agent can continue on with the original plan with appropriate assistance from other agents within its environment). Giuliani et al. (2015) classified failures according to their type, defining technical failures (caused by technical shortcomings of the robot) and social norm violations (when the robot deviates from the social script or uses inappropriate social signals, e.g., looking away from a person while talking to them).…”
Section: Defining and Classifying Errors (mentioning)
confidence: 99%
“…Nevertheless it is not surprising that human–robot interactions might fail when at times even human–human interactions do. Giuliani et al [38] described two types of failures in HRI, i.e. social norm violations and technical failures.…”
Section: Social and Situation Awareness (mentioning)
confidence: 99%