2023
DOI: 10.1016/j.aquaeng.2022.102304
Detection of residual feed in aquaculture using YOLO and Mask RCNN

Cited by 12 publications (3 citation statements)
References 10 publications
“…Despite the model being improved compared with previous methods, the results show that some residual feed is still not detected. Fine-tuning applied to the pre-trained model can improve the detection results [27]. Jin-Hyun Park et al. proposed a method designed with YOLO to detect and count the number of species that are dangerous and can destroy the ecosystem.…”
Section: Related Work (mentioning)
confidence: 99%
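The fine-tuning point in the statement above can be made concrete with a short sketch. The snippet below is not the authors' code: it assumes the Ultralytics YOLO package and a hypothetical dataset config file residual_feed.yaml (train/val image lists with a single "feed" class), and simply fine-tunes COCO-pretrained weights on that dataset.

```python
# Minimal sketch of fine-tuning a pre-trained detector on a residual-feed dataset.
# Assumptions: the Ultralytics YOLO package is installed, and "residual_feed.yaml"
# is a hypothetical dataset config describing the custom images and class names.
from ultralytics import YOLO

# Start from COCO-pretrained weights instead of training from scratch.
model = YOLO("yolov8n.pt")

# Fine-tune on the custom dataset; epoch count and image size are illustrative values.
model.train(data="residual_feed.yaml", epochs=50, imgsz=640)

# Evaluate on the validation split and report mAP.
metrics = model.val()
print(metrics.box.map)  # mean average precision over IoU 0.50:0.95
```

Starting from pretrained weights typically converges faster on small underwater datasets than training from scratch, which is the kind of improvement the citing statement alludes to.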
“…Underwater target detection technology is vital not only for military maritime defense tasks [1,2,3,4] but also for ecological environmental protection [5,6] and critical economic sectors, including fisheries and aquaculture [7,8,9,10]. This paper focuses on enhancing target detection performance in complex underwater environments.…”
Section: Introduction (mentioning)
confidence: 99%
“…This model exhibited an average detection speed of 3.289 f/s when operating at a resolution of 3000 × 3000. Reference 21 proposed a dual-strategy detection algorithm for feed residues in farming areas. The algorithm is improved on the basis of YOLOv3 while also using Mask RCNN.…”
Section: Introduction (mentioning)
confidence: 99%