2021
DOI: 10.1111/jfpe.13866

Apple target recognition method in complex environment based on improved YOLOv4

Abstract: To address the problem of accurately recognizing apples in complex environments, this article proposes an apple recognition method based on an improved YOLOv4, which can accurately locate and recognize apples in a variety of complex environments. The method uses the lightweight EfficientNet‐B0 network as the feature-extraction backbone for apple recognition and combines it with a PANet (Path Aggregation Network) to fuse features across adjacent feature layers, which improves the recognition…
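As a rough illustration of the architecture the abstract describes (not the authors' code), the minimal sketch below shows a PANet-style neck that fuses three multi-scale feature maps such as those produced by an EfficientNet‑B0 backbone before they reach YOLO-style detection heads. The channel counts, input shapes, and layer choices are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch of a PANet-style feature-fusion neck (assumed design, not the paper's code).
import torch
import torch.nn as nn
import torch.nn.functional as F


class ConvBlock(nn.Module):
    """1x1 conv + BN + activation used to align channel counts across stages."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size=1, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.SiLU()

    def forward(self, x):
        return self.act(self.bn(self.conv(x)))


class PANetNeck(nn.Module):
    """Top-down (FPN) pass followed by a bottom-up pass over three scales."""
    def __init__(self, in_channels=(40, 112, 320), out_ch=128):
        # (40, 112, 320) are typical EfficientNet-B0 stage widths; treated here as assumptions.
        super().__init__()
        self.lat = nn.ModuleList([ConvBlock(c, out_ch) for c in in_channels])
        self.down = nn.ModuleList(
            [nn.Conv2d(out_ch, out_ch, 3, stride=2, padding=1) for _ in range(2)]
        )

    def forward(self, c3, c4, c5):
        # Align channels of the three backbone stages.
        p3, p4, p5 = self.lat[0](c3), self.lat[1](c4), self.lat[2](c5)
        # Top-down fusion: upsample deeper maps and add them to shallower ones.
        p4 = p4 + F.interpolate(p5, size=p4.shape[-2:], mode="nearest")
        p3 = p3 + F.interpolate(p4, size=p3.shape[-2:], mode="nearest")
        # Bottom-up fusion: downsample shallower maps and add them to deeper ones.
        n3 = p3
        n4 = p4 + self.down[0](n3)
        n5 = p5 + self.down[1](n4)
        return n3, n4, n5  # multi-scale features for the detection heads


if __name__ == "__main__":
    neck = PANetNeck()
    c3 = torch.randn(1, 40, 52, 52)   # placeholder shapes for a 416x416 input
    c4 = torch.randn(1, 112, 26, 26)
    c5 = torch.randn(1, 320, 13, 13)
    print([t.shape for t in neck(c3, c4, c5)])
```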

Cited by 28 publications (14 citation statements). References 35 publications (37 reference statements).
“…Compared with the methodologies used for broken corn detection on a conveyor belt of a corn harvester based on different YOLO v3 models [15], the proposed YOLO v4 achieved a good balance between precision, recall, F1-score, and speed and could be considered the best model for sugar beet damage detection during harvesting. This finding is in line with reported studies that the YOLO networks could achieve higher speed and better overall performance, e.g., [31,33,36]. Furthermore, our finding is in agreement with [37], who reported that YOLO models achieved higher speed and F1-score compared with SVM and Faster R-CNN for apple surface defect detection.…”
Section: Results (supporting)
confidence: 93%
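For reference, the precision/recall/F1 balance mentioned in this statement follows the standard definitions; the short sketch below computes them from hypothetical detection counts (the numbers are made up for illustration, not taken from any of the cited studies).

```python
# Hypothetical counts for illustration only; not results from the cited studies.
tp, fp, fn = 90, 10, 15  # true positives, false positives, false negatives

precision = tp / (tp + fp)          # fraction of predicted boxes that are correct
recall = tp / (tp + fn)             # fraction of ground-truth objects that were found
f1 = 2 * precision * recall / (precision + recall)  # harmonic mean of the two

print(f"precision={precision:.3f}, recall={recall:.3f}, F1={f1:.3f}")
```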
“…In this study, the YOLO v4 model shows better performance compared to the other developed networks. This finding is in line with previous studies conducted for the detection of citrus in an orchard [30], apples in a complex farming environment [31], pests [32], and tree trunks in a forest [33]. However, the Faster R-CNN NAS model shows lower performance in this research.…”
Section: Results (supporting)
confidence: 92%
“…The mAP of the improved model increased by 2.38–4.81% through the analysis of detection effects under different lighting conditions, occlusion, and maturity. Ji et al. (2021) proposed an apple detection method based on the improved YOLOv4, which could accurately locate and detect apples in various complex environments. Although the YOLO series networks have shown excellent performance in fruit recognition, it is difficult to detect small targets in deep feature maps due to the loss of spatial and detailed feature information.…”
Section: Introduction (mentioning)
confidence: 99%
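The small-target difficulty noted in the last statement stems from the aggressive downsampling of deep feature maps; the quick calculation below (an illustration with assumed sizes, not figures from the paper) shows how a small apple can shrink to under one cell on a stride‑32 feature map.

```python
# Illustration with assumed sizes (not values from the paper): how much area a
# small object occupies on feature maps at strides typically used by YOLO heads.
input_size = 416            # network input resolution (a common YOLOv4 setting)
object_size = 20            # side length of a small apple in input pixels (assumed)

for stride in (8, 16, 32):  # shallow -> deep detection scales
    cells = input_size // stride
    obj_cells = object_size / stride
    print(f"stride {stride:>2}: grid {cells}x{cells}, "
          f"object spans {obj_cells:.2f} x {obj_cells:.2f} cells")
```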