2019
DOI: 10.48550/arxiv.1903.03217
Preprint

The Emotionally Intelligent Robot: Improving Social Navigation in Crowded Environments

Aniket Bera,
Tanmay Randhavane,
Rohan Prinja
et al.

Abstract: We present a real-time algorithm for emotion-aware navigation of a robot among pedestrians. Our approach estimates time-varying emotional behaviors of pedestrians from their faces and trajectories using a combination of Bayesian inference, CNN-based learning, and the PAD (Pleasure-Arousal-Dominance) model from psychology. These PAD characteristics are used for long-term path prediction and generating proxemic constraints for each pedestrian. We use a multi-channel model to classify pedestrian characteristics in…
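The abstract describes turning each pedestrian's estimated PAD (Pleasure-Arousal-Dominance) state into a proxemic constraint for navigation. A minimal sketch of that idea, with an entirely illustrative weighting (the paper estimates these behaviors from faces and trajectories; the weights and function below are assumptions, not the authors' method):

```python
# Hypothetical sketch: map a pedestrian's PAD scores to a proxemic comfort
# radius that a motion planner could treat as an inflated obstacle region.
# The weights are illustrative placeholders, not values from the paper.

def proxemic_radius(pleasure: float, arousal: float, dominance: float,
                    base_radius: float = 0.5) -> float:
    """Return a comfort distance in meters for one pedestrian.

    Inputs are PAD scores in [-1, 1]. In this toy mapping, higher arousal
    and dominance, and lower pleasure, widen the personal space the robot
    should keep around the pedestrian.
    """
    for v in (pleasure, arousal, dominance):
        if not -1.0 <= v <= 1.0:
            raise ValueError("PAD scores must lie in [-1, 1]")
    scale = 1.0 + 0.4 * arousal + 0.3 * dominance - 0.3 * pleasure
    return max(base_radius, base_radius * scale)

# Example: an agitated, dominant, displeased pedestrian gets a wider berth
# than the neutral base radius of 0.5 m.
print(proxemic_radius(-0.5, 0.8, 0.6))
```

A planner would then add this per-pedestrian radius to its collision constraints when scoring candidate trajectories, which is the role the PAD-derived "proxemic constraints" play in the approach described above.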

Cited by 6 publications (10 citation statements)
References 38 publications (49 reference statements)
“…This separation between the robot and people can be limited by the accessibility distance, the user’s comfort distance, and the user’s emotion. Based on these features and the ability of robots to recognise moods or emotional states of people, robots can plan the best routes to follow [ 15 , 43 , 44 ].…”
Section: Related Work
confidence: 99%
“…Developing systems with this perspective makes the robot able to adapt to social groups of humans [ 12 ]. The detection of groups of people improves the navigation of a social robot in indoor and outdoor environments, and the detection of group emotions allows the robot to improve HRI, exhibiting acceptable social behaviour [ 13 , 14 , 15 , 16 ], as well as associating the group emotion with the scene in which the group is participating. Nevertheless, most existing studies related to detecting group emotions are based on third-person cameras [ 17 , 18 , 19 , 20 , 21 ], but their complexity makes them unsuitable for social robots with egocentric vision due to their sensory capacity.…”
Section: Introduction
confidence: 99%
“…Their algorithm, however, requires a large number of 3D skeletal key-points to detect emotions and is limited to single-individual cases. Bera et al [33] classify emotions based on facial expressions along with pedestrian trajectories obtained from overhead cameras. Although this technique achieves good accuracy in predicting emotions from trajectories and facial expressions, it explicitly requires overhead cameras in its pipeline and does not work on a mobile robot equipped with only onboard cameras.…”
Section: B. Emotion Modeling and Classification
confidence: 99%
“…have been used to model social behavior, affective behavior, and personality traits [59]. These approaches have also been used for robot navigation among pedestrians [10,52].…”
Section: Intelligent Virtual Agent Modeling
confidence: 99%