2022
DOI: 10.1109/tim.2022.3168897
ES-Net: Efficient Scale-Aware Network for Tiny Defect Detection

Cited by 55 publications (19 citation statements)
References 35 publications
“…The data set used in the experiment was captured by Alibaba Cloud Tianchi [30] in a textile workshop in Guangdong Province. After manual sorting and selection, 10,321 pictures were selected, covering 8 kinds of defects.…”
Section: Methods
confidence: 99%
“…Zhang [30] improved the original YOLOv3 by introducing a new transfer learning method for detecting concrete bridge defects, raising performance by 13% over the original YOLOv3. Yu [31] improved YOLOv4-CSP to address the problem of small targets in industrial defect detection. They proposed an efficient stepped pyramidal network for fusing multi-scale features, thus improving the detection accuracy of small objects.…”
Section: B. Deep Learning Methods
confidence: 99%
“…With the rise of deep learning, numerous works apply generalized computer vision methods to defect detection. Some works are based on object detection [13, 14, 28, 30, 31], which relies on annotated rectangular boxes, enabling end-to-end defect localization and classification. Others apply semantic segmentation [5, 18, 24], which enables pixel-level localization and is suited for complex scenarios with difficult-to-locate boundaries.…”
Section: Deep Learning-Based Defect Localization
confidence: 99%