Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction (HRI 2020)
DOI: 10.1145/3371382.3378319
Perception of Emotional Gait-like Motion of Mobile Humanoid Robot Using Vertical Oscillation

Cited by 8 publications (11 citation statements)
References 20 publications
“…Citation information: DOI 10.1109/ACCESS.2021.3110160, IEEE Access. TABLE 5. Summary of emotional expression recognition by walking motions of robots in previous research: this study (Human and ibuki), the previous ibuki'20 study by [28], Wabian-2R [12] and the original CG by [29], and Nao by [13] and the original CG by [7]. …watched the emotional gait-induced upper body motions.…”
Section: Discussion
confidence: 94%
“…For reference, we summarize our results and previous research on emotional expression through robot walking motion here. Table 5 shows the robot height, the survey scale, the number of answer options, and the recognition rates of emotional expression in robot walking in previous research: ibuki'20 [28], Wabian-2R [12], CG of Wabian-2R [29], Nao [13], and CG of human animation [7].…”
Section: Discussion
confidence: 99%
“…This study only focused on grip timing and duration to convey emotions in human-robot touch interaction. However, other verbal/non-verbal modalities such as facial expressions [1][2][3][4], whole-body gestures [5,6], and voice characteristics like tone or pitch [7,8] are also critical in real settings. Mixing different modalities might increase perceived empathy, but appropriate behavior design for each modality is needed to avoid mismatched expressions.…”
Section: B. Different Modalities and Touch Characteristics
confidence: 99%
“…Expression of emotional behaviors is an essential capability for social robots that interact with people in daily settings. Many robotics researchers have focused on developing robot hardware with enough degrees of freedom (DOFs) to express various emotions through facial expressions [1][2][3][4], whole-body motions [5,6], and voice characteristics [7,8], as well as models that express appropriate emotions in interaction contexts [9,10]. These studies mainly used humanoid robots because human-like appearances and modalities are easily understood by humans, but emotion expression models are not limited to such human-like robots.…”
Section: Introduction
confidence: 99%