2023
DOI: 10.3390/s23062907

ARTD-Net: Anchor-Free Based Recyclable Trash Detection Net Using Edgeless Module

Abstract: Due to the sharp increase in household waste, separate collection is essential to reduce its volume, since it is difficult to recycle trash without it. However, because manual separation is costly and time-consuming, it is crucial to develop an automatic system for separate collection using deep learning and computer vision. In this paper, we propose two Anchor-free-based Recyclable Trash Detection Networks (ARTD-Net) which can recognize overlappe…

Cited by 2 publications (1 citation statement)
References 40 publications
“…The smaller YOLOv6 models (nano, tiny, and small) harness reparameterized VGG networks (RepBlock) with skip connections for training, which transition to simple 3x3 convolutional (RepConv) blocks during inference [22]. Meanwhile, the medium and large YOLOv6 models deploy reparameterized versions of the CSP backbone, termed CSP-StackRep, culminating in the EfficientRep backbone [21], [23].…”
Section: YOLOv5
confidence: 99%
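The citing statement above describes RepVGG-style structural reparameterization: multi-branch blocks with skip connections used during training are folded into plain 3×3 convolutions for inference. Below is a minimal PyTorch sketch of that idea; the names `RepBlockSketch` and `fuse_conv_bn`, and the exact branch layout (3×3 conv, 1×1 conv, and identity, each followed by BatchNorm), are illustrative assumptions and are not taken from the YOLOv6 or ARTD-Net code.

```python
# Minimal sketch of RepVGG-style reparameterization (illustrative, not YOLOv6's code).
import torch
import torch.nn as nn


def fuse_conv_bn(conv: nn.Conv2d, bn: nn.BatchNorm2d):
    """Fold BatchNorm statistics into the preceding convolution's weights and bias."""
    std = (bn.running_var + bn.eps).sqrt()
    scale = bn.weight / std
    fused_w = conv.weight * scale.reshape(-1, 1, 1, 1)
    conv_bias = conv.bias if conv.bias is not None else torch.zeros_like(bn.running_mean)
    fused_b = (conv_bias - bn.running_mean) * scale + bn.bias
    return fused_w, fused_b


class RepBlockSketch(nn.Module):
    """Training-time multi-branch block that collapses to one 3x3 conv at inference."""

    def __init__(self, channels: int):
        super().__init__()
        self.conv3 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn3 = nn.BatchNorm2d(channels)
        self.conv1 = nn.Conv2d(channels, channels, 1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.bn_id = nn.BatchNorm2d(channels)  # identity branch (skip connection)
        self.act = nn.ReLU()

    def forward(self, x):
        # Training-time forward: sum of 3x3, 1x1, and identity branches.
        return self.act(self.bn3(self.conv3(x)) + self.bn1(self.conv1(x)) + self.bn_id(x))

    @torch.no_grad()
    def to_deploy(self) -> nn.Conv2d:
        """Merge the three branches into a single 3x3 convolution for inference."""
        w3, b3 = fuse_conv_bn(self.conv3, self.bn3)
        w1, b1 = fuse_conv_bn(self.conv1, self.bn1)
        # Pad the 1x1 kernel to 3x3 so it can be added to the 3x3 kernel.
        w1 = nn.functional.pad(w1, [1, 1, 1, 1])
        # Express the identity branch as a 3x3 conv with a centred Dirac kernel.
        c = self.bn_id.num_features
        idx = torch.arange(c)
        w_id = torch.zeros(c, c, 3, 3)
        w_id[idx, idx, 1, 1] = 1.0
        std = (self.bn_id.running_var + self.bn_id.eps).sqrt()
        scale = self.bn_id.weight / std
        w_id = w_id * scale.reshape(-1, 1, 1, 1)
        b_id = self.bn_id.bias - self.bn_id.running_mean * scale
        deploy = nn.Conv2d(c, c, 3, padding=1, bias=True)
        deploy.weight.copy_(w3 + w1 + w_id)
        deploy.bias.copy_(b3 + b1 + b_id)
        return deploy


if __name__ == "__main__":
    block = RepBlockSketch(8).eval()
    x = torch.randn(1, 8, 16, 16)
    deploy = block.to_deploy()
    # The fused conv reproduces the multi-branch output up to floating-point error.
    assert torch.allclose(block(x), block.act(deploy(x)), atol=1e-5)
```

In eval mode, `act(deploy(x))` matches the multi-branch forward up to floating-point error, which is the equivalence that lets the deployed model replace the skip-connected training blocks with plain 3×3 convolutions without retraining.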