Over the last decade, unmanned aerial vehicle (UAV) technology has evolved significantly across different applications, as it provides a unique platform that combines the benefits of terrestrial and aerial remote sensing. Consequently, such technology has been established as an important source of data for different precision agriculture (PA) applications such as crop health monitoring and weed management. Generally, these PA applications depend on a vegetation segmentation process as an initial step, which aims to detect the vegetation objects in images of agriculture fields. The main result of the vegetation segmentation process is a binary image, where vegetation is presented in white and the remaining objects in black. Such a process can easily be performed using different vegetation indices derived from multispectral imagery. Recently, to expand the use of UAV imagery systems for PA applications, it became important to reduce the cost of such systems by using low-cost RGB cameras. Thus, developing vegetation segmentation techniques for RGB images is a challenging problem. This paper introduces a new vegetation segmentation methodology for low-cost UAV RGB images based on the Hue color channel. The proposed methodology follows the assumption that the colors in any agriculture field image can be divided into vegetation and non-vegetation colors. Therefore, four main steps are developed to detect five different threshold values using the hue histogram of the RGB image; these thresholds are capable of discriminating the dominant color, either vegetation or non-vegetation, within the agriculture field image. The achieved results of implementing the proposed methodology showed its ability to generate accurate and stable vegetation segmentation performance, with a mean accuracy of 87.29% and a standard deviation of 12.5%.
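The abstract describes hue-based thresholding of RGB imagery to produce a binary vegetation mask. The paper derives five adaptive thresholds from the hue histogram; the sketch below is a deliberately simplified version that uses a single fixed hue band. The band limits (`lo`, `hi`), the `rgb_to_hue` helper, and the `segment_vegetation` name are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def rgb_to_hue(img):
    """Convert an RGB image (floats in [0, 1]) to a hue channel in degrees [0, 360)."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    mx = img.max(axis=-1)
    mn = img.min(axis=-1)
    diff = np.where(mx - mn == 0, 1.0, mx - mn)  # avoid division by zero for grays
    hue = np.zeros_like(mx)
    m = mx == r
    hue[m] = (60.0 * ((g - b) / diff) % 360.0)[m]
    m = mx == g
    hue[m] = (60.0 * ((b - r) / diff) + 120.0)[m]
    m = mx == b
    hue[m] = (60.0 * ((r - g) / diff) + 240.0)[m]
    return hue

def segment_vegetation(img, lo=60.0, hi=180.0):
    """Binary mask: True (white) where hue falls in an assumed green band [lo, hi]."""
    hue = rgb_to_hue(img)
    return (hue >= lo) & (hue <= hi)
```

For example, a saturated green pixel has a hue near 120 degrees and falls inside the band, while a brown soil pixel has a hue near 30 degrees and is classified as non-vegetation.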
<p><strong>Abstract.</strong> The use of Unmanned Aerial Vehicle (UAV) imagery systems for Precision Agriculture (PA) applications drew a lot of attention over the last decade. A UAV as a platform for an imagery sensor provides a major advantage, as it can deliver higher spatial resolution images than a satellite platform. Moreover, it gives the user the ability to collect the needed images at any time and to cover agriculture fields faster than a terrestrial platform. Therefore, such UAV imagery systems are capable of filling the gap between aerial and terrestrial remote sensing. One important PA application for which UAV imagery systems showed great potential is weed management, and more specifically the weed detection step. The current weed management procedure depends on spraying the whole agriculture field with chemical herbicides to eliminate any weed plants in the field. Although such a procedure seems effective, it has a huge effect on the surrounding environment due to the excessive use of chemicals, especially since weed plants don't cover the whole field; usually they spread through only a few spots of it. Therefore, different efforts were introduced to develop weed detection techniques using UAV imagery systems. Despite their different advantages, UAV imagery systems didn't draw the users' interest due to many limitations, including the cost of the system. Therefore, this paper introduces a new weed detection methodology for RGB images acquired by a low-cost UAV imagery system. The proposed methodology detects high-density vegetation spots as an indication of weed patches. The achieved results showed the potential of the proposed methodology to use a low-cost UAV imagery system equipped with a low-cost RGB sensor for detecting weed patches in different cropped agriculture fields, even from different flight heights of 20, 40, 80, and 120 meters.</p>
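The abstract's core idea is flagging high-density vegetation spots as weed patch candidates. One minimal way to sketch that idea, assuming a binary vegetation mask is already available (e.g. from a segmentation step): tile the mask into windows, compute the vegetation fraction per window, and flag windows above a density threshold. The window size (`block`) and threshold (`density_thresh`) are illustrative assumptions, not values from the paper.

```python
import numpy as np

def vegetation_density(mask, block=4):
    """Fraction of vegetation pixels inside non-overlapping block x block windows."""
    h, w = mask.shape
    h2, w2 = (h // block) * block, (w // block) * block   # crop to a whole number of blocks
    m = mask[:h2, :w2].reshape(h2 // block, block, w2 // block, block)
    return m.mean(axis=(1, 3))

def weed_patches(mask, block=4, density_thresh=0.75):
    """Flag windows whose vegetation density exceeds an assumed threshold."""
    return vegetation_density(mask, block) > density_thresh
```

A densely vegetated window (e.g. fully covered) exceeds the threshold and is flagged, while windows with only scattered vegetation pixels are not.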
<p><strong>Abstract.</strong> Precision Agriculture (PA) management systems are considered among the top ten revolutions in the agriculture industry during the last couple of decades. Generally, PA is a management system that integrates different technologies, such as navigation and imagery systems, to control the use of agriculture industry inputs, aiming to enhance the quality and quantity of its output while preserving the surrounding environment from any harm these inputs might cause. During the last decade, Unmanned Aerial Vehicles (UAVs) showed great potential to enhance the use of remote sensing and imagery sensors for different PA applications such as weed management, crop health monitoring, and crop row detection. UAV imagery systems are capable of filling the gap between aerial and terrestrial imagery systems and of enhancing the use of imagery systems and remote sensing for PA applications. One important PA application that uses UAV imagery systems, and which drew lots of interest, is crop row detection, especially as it supports other applications such as weed detection and crop yield prediction. This paper introduces a new crop row detection methodology using a low-cost UAV RGB imagery system. The methodology has three main steps. First, the RGB images are converted into the HSV color space and the Hue channel is extracted. Then, sections with different orientation angles are generated in the Hue image; for each section, a Principal Component Analysis (PCA) of its Hue values evaluates their variance. The crop row orientation angle is detected as the orientation angle of the section that yields the minimum variance of Hue values. Finally, a scan line is swept over the Hue image with the same orientation angle as the crop rows. The scan line computes the average of the Hue values along each line parallel to the detected crop row orientation. The generated values form a graph of peaks and valleys that represent the crop and soil rows. The proposed methodology was evaluated using RGB images acquired by a low-cost UAV over a canola field. The images were taken at different flight heights and on different dates. The achieved results proved the ability of the proposed methodology to detect the crop rows in the different cases.</p>
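The two core steps described above, picking the scan orientation that minimizes hue variance along the lines and then averaging hue per scan line to obtain a peaks-and-valleys profile, can be sketched as follows. This is a simplification under stated assumptions: the paper sweeps arbitrary orientation angles and uses PCA per section, while this sketch only compares the two axis-aligned directions and uses plain per-line variance; the function names are illustrative.

```python
import numpy as np

def best_orientation(hue, angles=(0.0, 90.0)):
    """Pick the scan direction whose lines have the smallest mean hue variance.
    Lines parallel to crop rows stay on one material, so their variance is low.
    Only the two axis-aligned angles are sketched here."""
    scores = {}
    for a in angles:
        axis = 1 if a == 0.0 else 0          # 0 deg -> scan along image rows
        scores[a] = hue.var(axis=axis).mean()
    return min(scores, key=scores.get)

def row_profile(hue, angle):
    """Average hue along each scan line; peaks/valleys mark crop and soil rows."""
    axis = 1 if angle == 0.0 else 0
    return hue.mean(axis=axis)
```

On a hue image with horizontal crop rows (rows of constant hue alternating between green crop and brown soil), the horizontal orientation wins and the profile alternates between high and low hue values.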
ABSTRACT: In the last few years, multi-camera and LIDAR systems have drawn the attention of the mapping community and have been deployed on different mobile mapping platforms. The different uses of these platforms, especially UAVs, offered new applications and developments that require fast and accurate results. The successful calibration of such systems is a key factor in achieving accurate results and in processing the system measurements, especially given the different types of measurements provided by the LIDAR and the cameras. System calibration aims to estimate the geometric relationships between the different system components. A number of applications require that the systems be ready for operation in a short time, especially disaster monitoring applications. Also, many of the present system calibration techniques are constrained by the need for special lab arrangements for the calibration procedures. In this paper, a new technique for the calibration of integrated LIDAR and multi-camera systems is presented. The proposed technique offers a calibration solution that overcomes the need for the special labs required by standard calibration procedures. In the proposed technique, 3D reconstruction of automatically detected and matched image points is used to generate a sparse images-driven point cloud; then, a registration between the LIDAR-generated 3D point cloud and the images-driven 3D point cloud takes place to estimate the geometric relationships between the cameras and the LIDAR. A simple 3D artificial target is used to simplify the lab requirements for the calibration procedure. The target is composed of three intersecting plates; this geometry was chosen to ensure enough conditions for the convergence of the registration between the 3D point clouds constructed from the two systems.
The achieved results of the proposed approach prove its ability to provide an adequate and fully automated calibration without sophisticated calibration arrangements. The proposed technique shows high potential for system calibration in many applications, especially those with critical logistic and time constraints, such as disaster monitoring applications.
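The registration step described above ultimately estimates a rigid transformation (rotation and translation) between the images-driven cloud and the LIDAR cloud. As a minimal sketch of that core computation, the following implements the standard Kabsch/SVD least-squares alignment for point clouds with known correspondences; the paper's full registration would iterate with correspondence matching (ICP-style), and the function name is an illustrative assumption.

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rotation R and translation t such that dst ~ src @ R.T + t.
    Kabsch algorithm: align centroids, SVD of the cross-covariance, fix reflection.
    src, dst: (N, 3) arrays of corresponding 3D points."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)            # 3x3 cross-covariance of centered clouds
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against an improper (reflected) solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t
```

Applying a known rotation and translation to a synthetic cloud and running `rigid_transform` on the pair recovers the same transformation, which is the geometric relationship the calibration seeks between the camera and LIDAR frames.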
This article introduces a business intelligence (BI) framework to manage water resources, relying on a design science research methodology, a data warehousing platform, and Power BI visualization. The framework consists of three phases. The data preprocessing phase comprises the core functions to validate, clean, transform, aggregate, and load data for the next phase. The processing phase includes the warehousing platform, which organizes the data, builds relationships among them, and exports them in an appropriate format. Finally, the visualization phase generates descriptive reports after integrating the exported warehousing platform with the Power BI interface through multiple filters. To validate the performance of this framework, a heuristic evaluation was conducted to collect users' feedback, and a usability evaluation was conducted to uncover any major issues with visibility, flexibility, learnability, and operability. The results indicated that the tool improves decision-making and transforms raw data into useful information. In addition, the dashboards provide a good way to visualize data and identify trends. These aspects align directly with the requirements for a BI tool to support strategic water resources management in Egypt. For example, agriculture consumes 46.49 billion cubic meters of water annually, accounting for nearly 60% of Egypt's total water consumption; it is therefore the first area for rationalization and consideration. AlShirkia governorate consumes the most water in agriculture, with an annual consumption of approximately 4.631 billion cubic meters, 8 million people, 1.5 million acres, and 11.3 million tons of production.
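The figures quoted above imply a couple of derived quantities that a dashboard of this kind would surface. A small sketch of that arithmetic, using only the numbers stated in the abstract (the ~60% share is treated as approximate):

```python
# Figures from the abstract, in billion cubic meters per year.
agriculture = 46.49          # national agricultural water consumption
sharqia = 4.631              # AlShirkia governorate's agricultural consumption

# "Nearly 60%" of the national total implies a total of roughly 77.5 bcm/year.
implied_national_total = agriculture / 0.60

# AlShirkia's share of national agricultural water use is roughly 10%.
sharqia_share = sharqia / agriculture
```

These derived shares are the kind of aggregate a Power BI dashboard in the visualization phase would present alongside the raw consumption figures.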