2021
DOI: 10.3390/agriculture11111111
Sugar Beet Damage Detection during Harvesting Using Different Convolutional Neural Network Models

Abstract: Mechanical damage of sugar beet during harvesting affects the quality of the final products and the sugar yield. Mechanical damage of sugar beet is assessed randomly by harvester operators and, owing to the complexity of the harvester machines, can depend on the subjective opinion and experience of the operator. Thus, the main aim of this study was to determine whether a digital two-dimensional imaging system coupled with convolutional neural network (CNN) techniques could be utilized to detect visible mechan…

Citations: Cited by 18 publications (11 citation statements)
References: 33 publications
“…They obtained better computational cost and speed with YOLOv4-tiny, concluding that YOLOv4 is the best model in accuracy and speed. In [13], the authors thoroughly evaluate various CNN-based detector models, such as YOLO v4, region-based fully convolutional network (R-FCN), and Faster R-CNN, in order to detect visible mechanical damage to sugar beet during harvesting on a harvester machine from RGB video images. The best experimental results showed a recall, precision, and F1-score of about 92%, 94%, and 93%, respectively, and a speed of around 29 frames per second.…”
Section: Scientific Publications
confidence: 99%
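The recall, precision, and F1-score figures quoted above follow from the standard detection-metric definitions. A minimal sketch, using illustrative true-positive/false-positive/false-negative counts (not taken from the paper) chosen so the results land near the reported ~94% precision, ~92% recall, and ~93% F1:

```python
# Hedged sketch: standard detection metrics from confusion counts.
# The counts below are hypothetical, for illustration only.
def detection_metrics(tp: int, fp: int, fn: int):
    """Return (precision, recall, f1) for one detection class."""
    precision = tp / (tp + fp)          # fraction of predicted boxes that are correct
    recall = tp / (tp + fn)             # fraction of ground-truth boxes that are found
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean
    return precision, recall, f1

p, r, f1 = detection_metrics(tp=92, fp=6, fn=8)
print(f"precision={p:.2%} recall={r:.2%} f1={f1:.2%}")
```

The F1-score is the harmonic mean of precision and recall, so it always lies between the two and penalizes a large gap between them.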
“…Parico et al. [12] | RGB video | YOLOv4, YOLOv4-CSP, YOLOv4-tiny. Nasirahmadi et al. [13] | RGB video | YOLO v4, R-FCN and Faster R-CNN. Wang et al. [14] | RGB video | YOLOv3…”
Section: Article / Data Type / Model
confidence: 99%
“…A maximum accuracy of 91.37 percent is reached when evaluating the accuracies across diverse training and testing datasets. In the case of sugar beet, an existing model is upgraded using the faster region-based CNN architecture, modifying its parameters to recognize disease-affected regions ( Nasirahmadi et al., 2021 ). The dataset comprises 155 photos of sugar beets, and the proposed framework attained an accuracy rate of 95.48 percent.…”
Section: Related Work
confidence: 99%
“…AI and big data support better and more precise farm monitoring, data acquisition, and analytics; improve information extraction from sensors; and aid farm management [11]. For instance, crop health and productivity can be monitored and controlled using advanced AI and deep learning techniques [12]. Data-driven approaches augment on-farm decision-making capabilities, improve crop yield, and reduce losses, and therefore benefit farmers.…”
Section: Introduction
confidence: 99%