Cereals are a staple food for mankind. Grain shape extraction and filled/unfilled grain recognition are meaningful for crop breeding and genetic analysis. The conventional measuring method is mainly manual, which is inefficient, labor-intensive, and subjective. Therefore, a novel method was proposed to extract the phenotypic traits of cereal grains based on point clouds. First, a structured light scanner was used to obtain the grain point cloud data. Then, single grain segmentation was accomplished by image preprocessing, plane fitting, and region-growing clustering. The length, width, thickness, surface area, and volume were calculated by the specified analysis algorithms for the grain point cloud. To demonstrate this method, experimental materials including rice, wheat, and corn were tested. Compared with manual measurement results, the average measurement errors of grain length, width, and thickness were 2.07%, 0.97%, and 1.13%, respectively, and the average measurement efficiency was about 9.6 s per grain. In addition, a grain identification model was constructed with 25 grain phenotypic traits, using 6 machine learning methods. The results showed that the best accuracy for filled/unfilled grain classification was 90.184%. The best accuracy for indica and japonica identification was 99.950%, while that for identification of different varieties was only 47.252%. Therefore, this method proved to be an efficient and effective tool for crop research.
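The abstract does not detail how length, width, and thickness are derived from a segmented grain point cloud. A common approach (a minimal sketch, not the authors' implementation) is to rotate the cloud into its principal-axis frame via PCA and read off the extents along each axis:

```python
import numpy as np

def grain_dimensions(points: np.ndarray):
    """Estimate (length, width, thickness) of a single grain from its
    N x 3 point cloud by measuring extents along the principal axes."""
    centered = points - points.mean(axis=0)
    # Eigenvectors of the covariance matrix give the grain's natural axes.
    _, vecs = np.linalg.eigh(np.cov(centered.T))
    proj = centered @ vecs                          # rotate into principal frame
    extents = proj.max(axis=0) - proj.min(axis=0)   # span along each axis
    length, width, thickness = sorted(extents, reverse=True)
    return length, width, thickness
```

Sorting the extents in descending order follows the usual convention that length ≥ width ≥ thickness, regardless of how the grain was oriented during scanning.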
High-throughput phenotyping of yield-related traits is meaningful and necessary for rice breeding and genetic study. The conventional method for rice yield-related trait evaluation faces the problems of rice threshing difficulty, measurement process complexity, and low efficiency. To solve these problems, a novel intelligent system, which includes an integrated threshing unit, grain conveyor-imaging units, a threshed panicle conveyor-imaging unit, and specialized image analysis software, has been proposed to achieve rice yield trait evaluation with high throughput and high accuracy. To improve threshed panicle detection accuracy, Region of Interest Align, a Convolution-Batch normalization-activation with Leaky ReLU module, a Squeeze-and-Excitation unit, and an optimal anchor size were adopted to optimize the Faster-RCNN architecture, termed ‘TPanicle-RCNN’; the new model achieved an F1 score of 0.929, an increase of 0.044, and was robust to indica and japonica varieties. Additionally, AI cloud computing was adopted, which dramatically reduced the system cost and improved flexibility. To evaluate the system accuracy and efficiency, 504 panicle samples were tested, and the total spikelet measurement error decreased from 11.44% to 2.99% with threshed panicle compensation. The average measuring efficiency was approximately 40 s per sample, approximately twenty times faster than manual measurement. In this study, an automatic and intelligent system for rice yield-related trait evaluation was developed, which provides an efficient and reliable tool for rice breeding and genetic research.
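One of the named optimizations, the Squeeze-and-Excitation unit, reweights feature-map channels by a globally pooled, gated descriptor. The following is a generic NumPy sketch of the standard SE operation (the weight shapes and reduction ratio are illustrative assumptions, not the paper's configuration):

```python
import numpy as np

def squeeze_excitation(feature_map: np.ndarray, w1: np.ndarray, w2: np.ndarray):
    """Generic Squeeze-and-Excitation on a (C, H, W) feature map:
    squeeze by global average pooling, excite through a two-layer
    bottleneck, then rescale each channel by its sigmoid gate."""
    squeeze = feature_map.mean(axis=(1, 2))        # global average pool -> (C,)
    hidden = np.maximum(0.0, w1 @ squeeze)         # ReLU bottleneck, w1: (C/r, C)
    gate = 1.0 / (1.0 + np.exp(-(w2 @ hidden)))    # sigmoid gate, w2: (C, C/r)
    return feature_map * gate[:, None, None]       # channel-wise rescaling
```

Because the gate lies in (0, 1), the unit can only attenuate channels, letting the network emphasize panicle-relevant features learned during training.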
The wheat grain three-dimensional (3D) phenotypic characters are of great significance for final yield and variety breeding, and ventral sulcus traits are important factors for wheat flour yield. Wheat grain trait measurements are necessary; however, the traditional measurement method is still manual, which is inefficient, subjective, and labor-intensive; moreover, ventral sulcus traits can only be obtained by destructive measurement. In this paper, an intelligent analysis method based on structured light imaging has been proposed to extract 3D wheat grain phenotypes and ventral sulcus traits. First, the 3D point cloud data of wheat grains were obtained by a structured light scanner; then, specified point cloud processing algorithms, including single grain segmentation and ventral sulcus location, were designed; finally, 28 wheat grain 3D phenotypic characters and 4 ventral sulcus traits were extracted. To determine the best experimental conditions, three-level orthogonal experiments on rotation angle, scanning angle, and stage color were carried out on 125 grains of 5 wheat varieties, and the results demonstrated that the optimal rotation angle, scanning angle, and stage color were 30°, 37°, and black, respectively. Additionally, the results also proved that the mean absolute percentage errors (MAPEs) of wheat grain length, width, thickness, and ventral sulcus depth were 1.83%, 1.86%, 2.19%, and 4.81%, respectively. Moreover, 500 wheat grains of five varieties were used to construct and validate a wheat grain weight model based on 32 phenotypic traits, and the cross-validation results showed that the R2 of the models ranged from 0.77 to 0.83. Finally, wheat grain phenotype extraction and grain weight prediction were integrated into specialized software. Therefore, this method was demonstrated to be an efficient and effective approach for wheat breeding research.
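The accuracy figures above are reported as mean absolute percentage errors. For reference, MAPE follows the standard definition, averaging per-grain relative errors between system and manual measurements:

```python
def mape(measured, reference):
    """Mean absolute percentage error (in %) between system measurements
    and manual reference measurements, paired element-wise."""
    errors = [abs(m - r) / abs(r) for m, r in zip(measured, reference)]
    return 100.0 * sum(errors) / len(errors)
```

A MAPE of 1.83% for grain length means the system deviates from the manual caliper value by under 2% on average, i.e. well under 0.2 mm for a typical wheat grain.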
Verticillium wilt is one of the most critical cotton diseases and is widely distributed in cotton-producing countries. However, the conventional method of verticillium wilt investigation is still manual, with the disadvantages of subjectivity and low efficiency. In this research, an intelligent vision-based system was proposed to dynamically observe cotton verticillium wilt with high accuracy and high throughput. Firstly, a 3-coordinate motion platform was designed with a movement range of 6,100 mm × 950 mm × 500 mm, and a specific control unit was adopted to achieve accurate movement and automatic imaging. Secondly, verticillium wilt recognition models were established based on 6 deep learning architectures, among which the VarifocalNet (VFNet) model had the best performance with a mean average precision (mAP) of 0.932. Meanwhile, deformable convolution, deformable region of interest pooling, and soft non-maximum suppression optimization methods were adopted to improve VFNet, and the mAP of the VFNet-Improved model increased by 1.8%. The precision–recall curves showed that VFNet-Improved was superior to VFNet for each category and had a greater improvement effect on the ill leaf category than on the fine leaf category. The regression results showed that the system measurement based on VFNet-Improved achieved high consistency with manual measurements. Finally, the user software was designed based on VFNet-Improved, and the dynamic observation results proved that this system was able to accurately investigate cotton verticillium wilt and quantify the prevalence rate of different resistant varieties. In conclusion, this study has demonstrated a novel intelligent system for the dynamic observation of cotton verticillium wilt on the seedbed, which provides a feasible and effective tool for cotton breeding and disease resistance research.
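One of the optimizations applied to VFNet is soft non-maximum suppression. Unlike hard NMS, which discards any box overlapping a higher-scoring detection, soft-NMS only decays its score, which helps when diseased and healthy leaves overlap in the image. The sketch below shows the generic Gaussian variant (the `sigma` and threshold values are illustrative, not the paper's settings):

```python
import math

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    union = (a[2] - a[0]) * (a[3] - a[1]) + (b[2] - b[0]) * (b[3] - b[1]) - inter
    return inter / union if union > 0 else 0.0

def soft_nms(boxes, scores, sigma=0.5, min_score=0.05):
    """Gaussian soft-NMS: decay (rather than drop) the scores of boxes
    that overlap a higher-scoring detection; prune only very low scores."""
    dets = sorted(zip(boxes, scores), key=lambda d: -d[1])
    kept = []
    while dets:
        box, score = dets.pop(0)          # highest-scoring remaining box
        kept.append((box, score))
        rescored = []
        for b, s in dets:
            s *= math.exp(-iou(box, b) ** 2 / sigma)   # Gaussian decay
            if s >= min_score:
                rescored.append((b, s))
        dets = sorted(rescored, key=lambda d: -d[1])
    return kept
```

A heavily overlapping duplicate of a kept detection survives with a reduced score instead of vanishing, which typically raises recall on crowded leaf canopies.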