2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2023
DOI: 10.1109/iros55552.2023.10342447
Toward Human-Like Social Robot Navigation: A Large-Scale, Multi-Modal, Social Human Navigation Dataset

Duc M. Nguyen, Mohammad Nazeri, Amirreza Payandeh, et al.
Cited by 6 publications (1 citation statement)
References 23 publications
“…Recently, new datasets have emerged, for example, SiT [363], which contains indoor and outdoor recordings collected while the robot navigated in a crowded environment, capturing dense human-robot interactive dynamic scenarios with annotated pedestrian information. Nguyen et al. [364] developed the MuSoHu dataset by gathering recordings from sensors placed on human participants walking in human-occupied spaces; thus, interactions between robots and humans were not captured. Hirose et al. [134] presented the HuRoN dataset, collected as multimodal sensory data from a robot operating under an autonomous policy while interacting with humans in real-world scenes.…”
Section: Datasets
confidence: 99%