2018 IEEE Conference on Computational Intelligence and Games (CIG)
DOI: 10.1109/cig.2018.8490373

An Eye Gaze Model for Controlling the Display of Social Status in Believable Virtual Humans

Abstract: Designing highly believable characters remains a major concern within digital games. Matching a chosen personality and other dramatic qualities to displayed behavior is an important part of improving overall believability. Gaze is a critical component of social exchanges and serves to make characters engaging or aloof, as well as to establish a character's role in a conversation. In this paper, we investigate the communication of status-related social signals by means of a virtual human's eye gaze. We constructed…

Cited by 5 publications (3 citation statements)
References 39 publications
“…Beyond the more modest strategies of implementing at least some listening behavior, research has increasingly shifted focus to implementations of higher-order interactional features, such as social status-specific multimodal conduct [Nixon, DiPaola, Bernardet, 2018] and emotion monitoring, classifying and mirroring [Yalçın, 2020]. The general trend can be represented by a hierarchy (see fig.…”
Section: Specialized Subsystems (mentioning)
confidence: 99%
“…The behavior generation component allows for synchronization of reflective behavior, such as shifting body posture, breathing, gaze, or self-touching behavior, as responses to the events happening in the environment; idle movement patterns during the lack of external input increase the quality of the interaction (Bernardet, Kang, Feng, DiPaola, & Shapiro, 2017; Nixon, DiPaola, & Bernardet, 2018). The consistency and coordination of these movements allow generating a sense of personality in affective agents (Bernardet, Saberi, & DiPaola, 2016).…”
Section: Behavior Generation (mentioning)
confidence: 99%
“…The posture of the users, head gestures and hand gestures would provide valuable information about the personality of the user as well as the context and emotions of the user (DeVault et al, 2014). The personal space the user is keeping with the agent (Saberi, Bernardet, & DiPaola, 2015) as well as the gaze behavior (Nixon et al, 2018) can indicate the social context and personality. Moreover, additional biosensing capabilities such as heart-beat, skin conductance, breathing patterns as well as EEG signals can open up new possibilities for how an interaction with an empathic agent could look, by giving additional information on the user.…”
Section: New Implementation Platforms and Scenarios (mentioning)
confidence: 99%