Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction 2018
DOI: 10.1145/3171221.3171261
Multimodal Expression of Artificial Emotion in Social Robots Using Color, Motion and Sound

Cited by 85 publications (63 citation statements)
References 34 publications
“…A few recent studies have also investigated the design of non-linguistic sounds in HRI [30,27,10]. However, the amount of information that can be communicated by non-linguistic utterances is substantially limited, and according to Crumpton and Bethel, "there has been little research into how manipulating a robot's voice would affect its users" [11].…”
Section: Related Work
confidence: 99%
“…More research is needed to assess how other types of behavioral feedback impact the training interaction and how to correctly implement that demeanor into current machine learning techniques. Nevertheless, research with non-humanoid robots that communicate through nonverbal cues such as light [55,56], sound [57,58], or both [37,59,60] is an ongoing area of study in the field of Social Robotics. The next step of this research aims to identify and design light cues and audio utterances that can improve the experience of training robots.…”
Section: Discussion
confidence: 99%
“…Geometrically shaped robots like the artificial assistant Jibo [34], the robot playmate Cozmo [35], or Mira [36], a robot designed with Pixar's animation style, were developed to use fluid movements and animations to give an organic and emotional impression to the user. Movement, sound, and color are the main outputs used to express emotional feedback in these types of robots [37].…”
Section: Emotional Robots
confidence: 99%
“…They found that people could identify the correct emotions corresponding to each role (correctness determined by a pretest without the robot) and the drone's intent (e.g., take photo) better than chance. Löffler et al. [26] also used aesthetics to design fast rotations and circular motion in a ground robot to express joy, slow rotations away from the user for sadness, jumpy movements away from the user for fear, and shaking movements toward the user for anger. They found that fear is best communicated by motion, joy is best communicated by sound plus motion, and other emotions are best communicated by color and sound.…”
Section: Design and Communicativeness of Nonanthropomorphic Robot Motion
confidence: 99%