Ivorra Martínez, E.; Sánchez Salmerón, A.J.; Camarasa, J.; Diago, M.; Tardaguila, J. (2015). Assessment of grape cluster yield components based on 3D descriptors using stereo vision. Food

The purpose of this paper is to propose a three-dimensional computer vision approach to assessing grape yield components based on new 3D descriptors. To achieve this, a partial three-dimensional model of the grapevine cluster is first extracted using stereo vision. A number of grapevine quality components are then predicted using SVM models based on the new 3D descriptors. Experiments confirm that this approach is capable of predicting the main cluster yield components related to quality, such as cluster compactness and berry size (R² > 0.80, p < 0.05). In addition, other yield components (cluster volume, total berry weight and number of berries) were also estimated using SVM models, obtaining prediction R² values of 0.82, 0.83 and 0.71, respectively.
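To make the idea of 3D descriptors concrete, the sketch below computes a few simple geometric features from a partial point cloud of the kind a stereo reconstruction might produce. These particular descriptors (bounding-box volume, point count, and a crude density-based compactness proxy) are hypothetical stand-ins for illustration; they are not the descriptors defined in the paper.

```python
def cluster_descriptors(points):
    """Compute simple 3D descriptors from a partial point cloud of a
    grape cluster. These are illustrative stand-ins, not the paper's
    actual descriptors.

    points: list of (x, y, z) tuples from a stereo reconstruction.
    Returns a dict with an axis-aligned bounding-box volume, the point
    count, and a density-based compactness proxy.
    """
    xs, ys, zs = zip(*points)
    dx = max(xs) - min(xs)
    dy = max(ys) - min(ys)
    dz = max(zs) - min(zs)
    bbox_volume = dx * dy * dz
    # Crude compactness proxy: reconstructed points per unit of
    # bounding-box volume (denser clouds suggest tighter clusters).
    compactness = len(points) / bbox_volume if bbox_volume > 0 else float("inf")
    return {
        "bbox_volume": bbox_volume,
        "n_points": len(points),
        "compactness": compactness,
    }
```

In a pipeline like the one described, a feature vector of such descriptors per cluster would be fed to an SVM regressor (e.g. scikit-learn's `SVR`) trained against ground-truth measurements of volume, berry weight or berry count.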
Automated lifespan determination for C. elegans cultured in standard Petri dishes is challenging. Problems include occlusion by Petri dish edges, aggregation of worms, and accumulation of dirt (dust spots on lids) during assays. This work presents a protocol for a lifespan assay, with two image-processing pipelines applied to different plate zones, and a new data post-processing method to solve these problems. Specifically, certain steps in the culture protocol were taken to alleviate aggregation, occlusion, contamination, and condensation problems. The method is based on an active illumination system that facilitates automated image sequence analysis, requires no manual threshold adjustments, and simplifies the techniques needed to extract lifespan curves. In addition, two image-processing pipelines, applied to different plate zones, were employed for automated lifespan determination. The first pipeline was applied to the wall zone and used only pixel-level information, because worm size and shape features are unavailable in this zone. The second pipeline, applied to the plate centre, fused information at the worm and pixel levels. Simple death-event detection was used to automatically obtain lifespan curves from the image sequences captured once daily throughout the assay. Finally, a new post-processing method was applied to the extracted lifespan curves to filter errors. The experimental results showed that the errors in automated counting of live worms followed a Gaussian distribution with a mean of 2.91% and a standard deviation of 12.73% per Petri plate. Post-processing reduced this error to 0.54 ± 8.18% per plate. The automated survival curve incurred an error of 4.62 ± 2.01%, while the post-processing method reduced the lifespan curve error to approximately 2.24 ± 0.55%.
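The overall flow from daily live-worm counts to a filtered survival curve can be sketched as follows. The monotone clamp shown here is only one simple post-processing idea, chosen for illustration because survival cannot increase over time; the paper's actual error-filtering method may differ.

```python
def survival_curve(daily_live_counts):
    """Convert daily live-worm counts into survival fractions.

    daily_live_counts: list of ints, one automated count per daily
    image capture. Returns fractions relative to the day-0 population.
    """
    n0 = daily_live_counts[0]
    return [c / n0 for c in daily_live_counts]


def enforce_monotone(curve):
    """Illustrative post-processing filter (a hypothetical stand-in
    for the paper's method): clamp each value to the running minimum,
    since a survival fraction cannot rise once worms have died."""
    out, running_min = [], float("inf")
    for v in curve:
        running_min = min(running_min, v)
        out.append(running_min)
    return out
```

For example, a counting error that briefly raises the live count (e.g. a dust spot resolved, or an aggregate separated) produces a non-monotone bump in the raw curve, which the filter removes.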
In recent years, the benefits of both Augmented Reality (AR) technology and infrared thermography (IRT) have been demonstrated in the industrial maintenance sector, allowing maintenance operations to be carried out in a safer, faster, and more efficient manner. However, no existing solution optimally combines both technologies. In this work, we propose a new AR system, MANTRA, with specific application to industrial maintenance. The system can automatically align virtual information and temperature data on any 3D object in real time. This is achieved through the joint use of an RGB-D sensor and an IRT camera, leading to high accuracy and robustness. To this end, a pose estimation method was developed that combines a deep-learning-based object detector, YOLOv4, with the template-based LINEMOD pose estimation method and a model-based 6DOF pose-tracking technique. The MANTRA system is validated both quantitatively and qualitatively through a real use case, demonstrating its effectiveness compared to traditional methods and to those using only AR.
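The detect-then-track control flow described above can be sketched as a small skeleton: a detector and a template-based pose estimator initialise the 6DOF pose, and a frame-to-frame tracker updates it, falling back to re-detection when tracking confidence drops. Every callable here (`detect`, `estimate_pose`, `track`) is a hypothetical stand-in, not the MANTRA API or a real YOLOv4/LINEMOD binding.

```python
def estimate_pose_pipeline(frames, detect, estimate_pose, track,
                           conf_threshold=0.5):
    """Control-flow sketch of a detect-then-track 6DOF pose pipeline.

    detect(frame) -> bounding box        (e.g. a YOLOv4-style detector)
    estimate_pose(frame, bbox) -> pose   (e.g. LINEMOD-style templates)
    track(frame, pose) -> (pose, conf)   (model-based frame-to-frame tracker)

    All three callables are hypothetical stand-ins for illustration.
    """
    pose = None
    poses = []
    for frame in frames:
        if pose is not None:
            # Cheap per-frame update while tracking holds.
            pose, conf = track(frame, pose)
            if conf < conf_threshold:
                pose = None  # tracking lost: force re-initialisation
        if pose is None:
            # Expensive re-initialisation: detect, then match templates.
            bbox = detect(frame)
            pose = estimate_pose(frame, bbox)
        poses.append(pose)
    return poses
```

The design point this illustrates is the usual trade-off in such systems: detection plus template matching is accurate but slow, so it runs only at initialisation and after tracking failures, while the lightweight tracker carries the pose between frames.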