2023
DOI: 10.1016/j.eja.2023.126845
Maize seedling information extraction from UAV images based on semi-automatic sample generation and Mask R-CNN model

Cited by 8 publications (7 citation statements)
References 26 publications
“…Previous studies estimating crop quantity from ultra-high-definition digital images have relied primarily on deep learning. For instance, Vong et al. and Gao et al. [35] employed the U-Net and Mask R-CNN models, respectively, for image segmentation. Taking an object-detection approach instead, Liu et al. [34] selected YOLOv3, while Xu et al. [36] and Cardellicchio et al. [67] opted for YOLOv5 for accurate crop quantity estimation.…”
Section: Advantages of the Proposed Methods (mentioning)
confidence: 99%
“…Particularly for estimating maize quantities, Liu et al. [34] used You Only Look Once (YOLO) v3 to estimate maize plant counts with a precision of 96.99%. Gao et al. [35] fine-tuned the Mask Region-based Convolutional Neural Network (Mask R-CNN) model for automatic maize identification, achieving an average precision (AP) of 0.729 and an average recall (AR) of 0.837 at an intersection over union (IoU) threshold of 0.5. Xu et al. [36], Xiao et al. [22], and others used the YOLOv5 model to estimate maize plant counts from UAV digital images.…”
Section: Introduction (mentioning)
confidence: 99%
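The AP@0.5 IoU and AR@0.5 IoU figures quoted above hinge on the standard intersection-over-union overlap measure between a predicted and a ground-truth bounding box. A minimal sketch of that computation (illustrative only; not code from the cited papers, and box coordinates are assumed to be `(x1, y1, x2, y2)` corners):

```python
def iou(box_a, box_b):
    """Intersection over union of two axis-aligned boxes given as (x1, y1, x2, y2).

    A detection counts as a true positive at IoU threshold 0.5 when
    iou(pred, truth) >= 0.5 against some unmatched ground-truth box.
    """
    # Corners of the intersection rectangle.
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    # Clamp to zero when the boxes do not overlap.
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```

Identical boxes give 1.0, disjoint boxes give 0.0, and partial overlaps fall in between, which is what the 0.5 threshold in AP@0.5 IoU is applied to.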
“…In the equations, P denotes precision and R denotes recall. TP (true positives) is the number of positive samples correctly predicted as positive; TN (true negatives) is the number of negative samples correctly predicted as negative; FP (false positives) is the number of negative samples incorrectly predicted as positive; and FN (false negatives) is the number of positive samples incorrectly predicted as negative [26,27]. The calculation of each evaluation metric is as follows:…”
Section: Evaluation Indicators (mentioning)
confidence: 99%
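The TP/FP/FN definitions above translate directly into the usual precision and recall formulas, P = TP / (TP + FP) and R = TP / (TP + FN). A minimal Python sketch of that calculation over binary label lists (illustrative only, not the cited papers' code):

```python
def precision_recall(y_true, y_pred):
    """Compute (precision, recall) from parallel lists of binary labels.

    TP: true 1 predicted 1; FP: true 0 predicted 1; FN: true 1 predicted 0.
    """
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if (tp + fp) else 0.0  # P = TP / (TP + FP)
    recall = tp / (tp + fn) if (tp + fn) else 0.0     # R = TP / (TP + FN)
    return precision, recall
```

For example, with three true positives of which two are found and one spurious detection, both precision and recall come out at 2/3.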
“…In the wake of deep learning, scholars have increasingly applied convolutional neural networks (CNNs) in agronomy research, with relatively good results in identifying pests and diseases [32], classifying and screening field weeds [33], collecting key crop information [34], and predicting crop growth [35]. For instance, using a fine-tuned Mask region-based CNN (Mask R-CNN) model, Gao et al. [36] proposed an automatic maize seedling identification method that adapts to various developmental stages of maize seedlings, quickly and accurately extracting phenotypic information in field environments while reducing labor costs. The model had an average detection accuracy of 88.70%, and its average seedling-emergence detection accuracies for 2019, 2020, and 2021 were 98.87%, 95.70%, and 98.77%, respectively.…”
Section: Introduction (mentioning)
confidence: 99%