2016
DOI: 10.1145/2882970
See You See Me

Abstract: We focus on a fundamental looking behavior in human-robot interactions – gazing at each other's face. Eye contact and mutual gaze between two social partners are critical in smooth human-human interactions. Therefore, investigating at what moments and in what ways a robot should look at a human user's face as a response to the human's gaze behavior is an important topic. Toward this goal, we developed a gaze-contingent human-robot interaction system, which relied on momentary gaze behaviors from a human user t…

Cited by 35 publications (10 citation statements)
References 59 publications
“…Prasov and Chai [31] developed a system that combines speech and eye gaze to enhance reference resolution in conversational interfaces. Xu et al. [32] investigated the role of mutual gaze in a human-robot collaboration setting and found that maintaining eye contact leads to improved multimodal interaction behavior of users, i.e., behavior that is more synchronized and coordinated. Baur et al. [33] implemented NovA, a system for analyzing and interpreting social signals in multi-modal interactions with a conversational agent, which integrates eye-tracking technology.…”
Section: Related Work
confidence: 99%
“…The issue of conversation with Pepper includes expressions such as robot gaze [34], eye blink synchrony [35], eye contact [36], and speech [37]. In “Kenkou-oukoku TALK for Pepper” [38], an application that supports conversation between older persons and Pepper, this issue appears as vocalization with less intonation.…”
Section: An Example of Conversation with Older Persons and Pepper
confidence: 99%
“…The behaviors of agents, both human and robot, were found to affect each other in real time. Switching robot attention between a human partner's face and an object in a joint-attention task fostered human-robot coordination (Xu et al., 2016). Others have similarly studied how ToM attributions of robots influence HRI.…”
Section: Theory of Mind
confidence: 99%
“…When the robot had human-like characteristics, gaze cues guided human responses, suggesting that intentions were being inferred, thus improving HRI. Research has also examined settings where humans and robots participated in joint-attention tasks (Xu, Zhang, & Yu, 2016). The behaviors of agents, both human and robot, were found to affect each other in real time.…”
Section: Theory of Mind
confidence: 99%