2019
DOI: 10.1007/s00170-019-04293-x
Automated visual positioning and precision placement of a workpiece using deep learning

Cited by 37 publications (12 citation statements)
References 19 publications
“…Segmenting the workpiece area from the non-workpiece area in the point cloud data helps to increase the number of detected workpieces and to estimate the correct object poses [17]. A position image of the workpieces can be used to train a CNN to predict the position deviation of the mechanical part so that the workpieces are placed accurately on the fixture [18]. Thanks to the powerful fitting capability of deep learning network frameworks, workpiece detection tasks that are more diverse and complex, and that traditional algorithms cannot handle, can now be accomplished.…”
Section: Related Work (mentioning)
confidence: 99%
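The CNN-based placement correction mentioned in that statement can be illustrated with a minimal sketch. The network below regresses a planar position deviation (dx, dy, dθ) from a grayscale position image of the workpiece; the architecture, the 128×128 input size, and all names are illustrative assumptions, not the network reported in the cited paper.

```python
import torch
import torch.nn as nn

class DeviationRegressor(nn.Module):
    """Toy CNN mapping a workpiece position image to (dx, dy, dtheta).

    Illustrative only: layer sizes and the 128x128 input are assumptions,
    not the architecture used in the cited work.
    """
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, 3)  # dx [mm], dy [mm], dtheta [deg]

    def forward(self, x):
        x = self.features(x)
        return self.head(x.flatten(1))

# One training step: regress the measured placement deviation (dummy data here).
model = DeviationRegressor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

images = torch.rand(8, 1, 128, 128)   # batch of position images (placeholder data)
deviations = torch.zeros(8, 3)        # measured (dx, dy, dtheta) labels (placeholder data)

loss = loss_fn(model(images), deviations)
loss.backward()
optimizer.step()
```

In a placement setup along these lines, the predicted deviation would be fed back to the handling mechanism to correct the workpiece position on the fixture before clamping.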
“…The performance evaluation results of the proposed algorithm have shown accurate predictions in detecting tool wear under various cutting conditions with a high-speed response rate. Li and Chang [4] proposed an automated visual positioning system for precision placement of a workpiece on the fixture. The experimental evidence of workpiece placement confirms that the low-resolution (640 × 480 pixels) camera can achieve a translational precision of ±0.2 mm, and the binocular system can keep the rotational error within ±0.1°.…”
Section: Introduction (mentioning)
confidence: 99%
“…2 (b). In those tasks, the 3D vision sensors can be utilized to capture the point cloud data (PCD) and estimate the pose of parts through the iterative closest point (ICP) algorithm [14] or by using neural networks [11,13,28].…”
Section: Introduction (mentioning)
confidence: 99%
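As a hedged illustration of the ICP-based pose estimation mentioned in that statement, the sketch below aligns a captured point cloud to a reference model with Open3D's point-to-point ICP (pipelines API, v0.10+); the file names and the 2 mm correspondence threshold are assumptions for the example, not values from the cited works.

```python
import numpy as np
import open3d as o3d

# Load a reference model of the part and a captured scan (file names are placeholders).
target = o3d.io.read_point_cloud("part_model.pcd")  # reference point cloud
source = o3d.io.read_point_cloud("part_scan.pcd")   # point cloud captured by the 3D sensor

# Initial pose guess (identity here; a coarse global alignment would normally seed this).
init_pose = np.identity(4)

# Point-to-point ICP: refine the rigid transform that maps the scan onto the model.
result = o3d.pipelines.registration.registration_icp(
    source, target,
    max_correspondence_distance=2.0,  # assumed 2 mm correspondence search radius
    init=init_pose,
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint(),
)

print("Estimated 4x4 part pose:\n", result.transformation)
print("Fitness:", result.fitness, "Inlier RMSE:", result.inlier_rmse)
```

The resulting 4×4 transformation gives the part pose relative to the reference model, which is the quantity a robot would use to plan the pick or placement motion.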