2022
DOI: 10.1080/01691864.2022.2115315
Solution of World Robot Challenge 2020 Partner Robot Challenge (Real Space)

Cited by 8 publications (6 citation statements). References 27 publications.
“…In our previous research, we created a recognition technology that utilizes YolactEdge [11] and point cloud information. This recognition technology can estimate the position and posture of an object [12].…”
Section: Previous Research, 2.1 Object Recognition and Position Estimation (mentioning)
confidence: 99%
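
The statement above describes pairing 2D instance masks with point cloud data to recover an object's position and posture. As a rough illustration only, not the cited authors' implementation, the sketch below shows one common way to do this: select the cloud points under the mask, take their centroid as the position, and use PCA for a coarse orientation. The function name, array shapes, and the PCA step are assumptions.

import numpy as np

def estimate_pose(mask, cloud):
    # Hypothetical inputs (not from the cited paper):
    # mask:  (H, W) boolean instance mask, e.g. from an instance segmenter
    # cloud: (H, W, 3) organized point cloud, XYZ per pixel, NaN where invalid
    pts = cloud[mask]                          # (N, 3) points on the object
    pts = pts[~np.isnan(pts).any(axis=1)]      # drop pixels with no valid depth
    position = pts.mean(axis=0)                # centroid as the position estimate
    centered = pts - position
    # PCA via SVD: principal axes of the point set as a rough posture estimate
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    rotation = vt.T                            # columns are the principal axes
    return position, rotation

This is only a baseline; the cited system may refine the estimate with model fitting or filtering, details that are not given in this report.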
“…Recently, the demand for home service robots has been increasing due to a low birth rate and an aging population, and research on such robots has been active [1], [2], [3], [4], [5], [6]. Highly accurate task planning is necessary to realize a general-purpose service robot that performs appropriate actions in response to human requests.…”
Section: Introduction (mentioning)
confidence: 99%
“…These robots operate in dynamic environments, frequently encountering new and unfamiliar objects, making the learning process of recognition technologies and operational efficiency crucial. Ono et al.'s research [4] automates dataset generation and annotation, reducing the time and cost involved in the learning process while maintaining real-time performance using You Only Look Once v4 (YOLOv4) [5]. However, their experiments [6] showed that it took two hours to prepare 500,000 training images and 18 hours for YOLO training.…”
Section: Introduction (mentioning)
confidence: 99%
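
For context on the dataset-automation claim above, the following is a minimal, hypothetical sketch of such a pipeline, not Ono et al.'s actual method: it composites a pre-cut object image onto a background and writes a YOLO-format label (class id plus normalized center and size). The function name, file paths, and the Pillow-based compositing are all assumptions.

import random
from pathlib import Path
from PIL import Image

def compose_sample(background, cutout, class_id, out_img: Path, out_lbl: Path):
    # background: RGB PIL image; cutout: RGBA PIL image (alpha marks the object),
    # assumed smaller than the background. Both are hypothetical inputs.
    bw, bh = background.size
    cw, ch = cutout.size
    x = random.randint(0, bw - cw)             # random placement of the object
    y = random.randint(0, bh - ch)
    composed = background.copy()
    composed.paste(cutout, (x, y), cutout)     # alpha channel used as paste mask
    composed.save(out_img)
    # YOLO label format: class cx cy w h, all normalized to [0, 1]
    cx, cy = (x + cw / 2) / bw, (y + ch / 2) / bh
    out_lbl.write_text(f"{class_id} {cx:.6f} {cy:.6f} {cw / bw:.6f} {ch / bh:.6f}\n")

Generating labels at composition time like this removes manual annotation from the loop, which is the cost reduction the citing paper attributes to Ono et al.'s approach; the specific generation strategy they used is not detailed in this report.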