2020
DOI: 10.1007/s10163-020-01098-z
Deep learning of grasping detection for a robot used in sorting construction and demolition waste

Cited by 49 publications (18 citation statements)
References 26 publications
“…This section summarizes and classifies the robot hands and grippers in terms of the mechanism, material, and manipulation strategy, which are employed for sorting manipulation. Furthermore, robotic sorting manipulation methods [95], [96], [97], [98], [99], [100], [101] specific to the end-effectors are explored.…”
Section: A End-effectors and Manipulation
Mentioning confidence: 99%
“…All these algorithms were adopted in solid waste classification shortly after having been launched. Relevant practices are presented in Table , and wastes were involved which included domestic wastes, CDW, and WEEE and achieved inspired results. Faster R-CNN and Mask R-CNN have been prevailing because of high accuracy and detection speed.…”
Section: Development and Status Quo Of Sensor-based Waste Sorting Tec...
Mentioning confidence: 99%
“…Hyperspectral imaging has found widespread acceptance as a tool in machine vision for classification of fruits, waste products, and in many other organic and inorganic items Barnabé et al (2015); Ku et al (2021); Kwak et al (2021). These approaches demonstrate the diversity of applications the technology is relevant to, but lack generality to multiple robotic problems.…”
Section: Spectroscopy In Robots
Mentioning confidence: 99%
“…The first sensor, a Time of Flight (ToF) depth camera (Microsoft). At the nominal operating height of 0.7 m above the workcell surface, the ToF camera was estimated to have a systematic spatial error ≤ 1 mm, making it perfect for perceiving the presence of small objects in scene Kurillo et al (2022). The camera outputs registered RGB and depth images; the latter of which can easily be projected into 3D space…”
Section: Perception
Mentioning confidence: 99%
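The projection of a registered depth image into 3D space mentioned in the statement above follows the standard pinhole camera model. A minimal sketch, assuming hypothetical intrinsics `fx`, `fy`, `cx`, `cy` (not taken from the cited setup):

```python
import numpy as np

def depth_to_pointcloud(depth, fx, fy, cx, cy):
    """Project a registered depth image (metres) into 3D camera coordinates.

    Pinhole model: x = (u - cx) * z / fx, y = (v - cy) * z / fy.
    fx, fy are focal lengths in pixels; cx, cy the principal point.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel grids
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1)  # shape (H, W, 3)

# Toy example: a flat surface 0.7 m from the camera (the nominal
# operating height quoted above); intrinsics here are illustrative.
depth = np.full((4, 4), 0.7)
pts = depth_to_pointcloud(depth, fx=500.0, fy=500.0, cx=2.0, cy=2.0)
```

The point under the principal point maps to (0, 0, 0.7); off-centre pixels fan out proportionally to depth.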