2023
DOI: 10.3389/frobt.2023.1271610

Real-time emotion generation in human-robot dialogue using large language models

Chinmaya Mishra,
Rinus Verdonschot,
Peter Hagoort
et al.

Abstract: Affective behaviors enable social robots not only to establish better connections with humans but also to express their internal states. It is well established that emotions are important for signaling understanding in Human-Robot Interaction (HRI). This work harnesses the power of Large Language Models (LLMs) and proposes an approach to control the affective behavior of robots. By interpreting emotion appraisal as an Emotion Recognition in Conversation (ERC) task, we use…
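The ERC framing described in the abstract can be illustrated with a minimal sketch: the dialogue history is serialized into a classification prompt and an LLM is asked to label the emotion of the latest turn. This is an assumption-laden illustration, not the authors' implementation; `query_llm`, the emotion label set, and the prompt wording are all hypothetical placeholders.

```python
# Hypothetical sketch of emotion appraisal framed as Emotion Recognition in
# Conversation (ERC). `query_llm` stands in for any chat-completion API call;
# the label set and prompt format are illustrative assumptions.

EMOTIONS = ["joy", "sadness", "anger", "fear", "surprise", "neutral"]

def build_erc_prompt(history):
    """Serialize (speaker, utterance) turns into an ERC classification prompt."""
    turns = "\n".join(f"{speaker}: {utterance}" for speaker, utterance in history)
    return (
        "Classify the emotion of the last User turn.\n"
        f"Choose one of: {', '.join(EMOTIONS)}.\n\n"
        f"{turns}\nEmotion:"
    )

def appraise_emotion(history, query_llm):
    """Return an emotion label for the latest turn, defaulting to 'neutral'."""
    label = query_llm(build_erc_prompt(history)).strip().lower()
    return label if label in EMOTIONS else "neutral"
```

Constraining the model to a closed label set and falling back to "neutral" keeps the robot's affective controller robust to free-form LLM output.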

Cited by 10 publications (9 citation statements)
References 42 publications
“…Few researchers have started addressing this gap. For example, a recent study [36] examined the relationship between robot and human gaze behavior. The study involved a within-subjects design in which 33 participants interacted with a Furhat robot in two experimental conditions: Fixed Gaze and Gaze Aversion.…”
Section: Related Work
“…If robots do not exhibit gaze aversions, then users may have to put in extra effort to avoid frequent mutual gaze with the robot, which can make the interaction more difficult. In subsequent work, [35] utilized data collected in [36] and explored the relationship between gaze and speech entrainment. The PRAAT toolkit was employed to extract mean pitch values of the participants' and robot's speech at each turn exchange.…”
Section: Related Work