Traditional imagery, provided, for example, by RGB and/or NIR sensors, has proven useful in many agroforestry applications. However, it lacks the spectral range and precision that only hyperspectral sensors can provide to profile materials and organisms. This kind of high-resolution spectroscopy was first used on satellites and later on manned aircraft, which are significantly more expensive platforms and are highly restrictive due to limited availability and/or complex logistics. More recently, UAS have emerged as a popular and cost-effective remote sensing technology, composed of aerial platforms capable of carrying small, lightweight sensors. Meanwhile, developments in hyperspectral technology have consistently resulted in smaller and lighter sensors that can now be integrated into UAS for either scientific or commercial purposes. The ability of hyperspectral sensors to measure hundreds of bands adds complexity given the sheer quantity of acquired data, whose usefulness depends on calibration and corrective tasks performed in pre- and post-flight stages. Further hyperspectral data processing steps must then be performed to retrieve relevant information, which delivers the true benefits for informed interventions in agricultural crops and forested areas. Considering the aforementioned topics, and with the goal of providing a global view of hyperspectral-based remote sensing supported by UAV platforms, this paper presents a survey covering hyperspectral sensors, the associated data processing, and applications in agriculture and forestry, wherein the combination of UAVs and hyperspectral sensors plays a central role. First, the advantages of hyperspectral data over RGB imagery and multispectral data are highlighted.
Then, hyperspectral acquisition devices are addressed, including sensor types, acquisition modes, and UAV-compatible sensors that can be used for both research and commercial purposes. Pre-flight operations and post-flight pre-processing are pointed out as necessary to ensure the usefulness of hyperspectral data for further processing towards the retrieval of conclusive information. With the goal of simplifying hyperspectral data processing (by isolating the common user from its mathematical complexity), several available toolboxes that allow direct access to level-one hyperspectral data are presented. Finally, research works focusing on the combination of UAVs and hyperspectral sensors for agriculture and forestry applications are reviewed, followed by the paper's conclusions.
Currently, climate change poses a global threat, which may compromise the sustainability of agriculture, forestry, and other land surface systems. In this changing scenario, Remote Sensing (RS) is of major economic importance for monitoring forest and agricultural resources and is imperative to the development of agroforestry systems. Traditional RS technologies encompass satellite and manned aircraft platforms, which are continuously improving in terms of spatial, spectral, and temporal resolutions. High spatial and temporal resolutions, flexibility, and lower operational costs make Unmanned Aerial Vehicles (UAVs) a good alternative to traditional RS platforms. In the management of forest resources, UAVs are one of the most suitable options to consider, mainly due to: (1) low operational costs and high-intensity data collection; (2) their capacity to host a wide range of sensors that can be adapted to be task-oriented; (3) the ability to plan data acquisition campaigns, avoiding inadequate weather conditions and providing data availability on demand; and (4) the possibility of use in real-time operations. This review aims to present the most significant UAV applications in forestry, identifying the appropriate sensors to be used in each situation as well as the data processing techniques commonly implemented.
Unmanned aerial vehicles have become a popular remote sensing platform for agricultural applications, with an emphasis on crop monitoring. Although there are several methods to detect vegetation through aerial imagery, these remain dependent on the manual extraction of vegetation parameters. This article presents an automatic method that allows for individual tree detection and multi-temporal analysis, which is crucial for detecting missing and new trees and monitoring their health conditions over time. The proposed method is based on the computation of vegetation indices (VIs), using visible (RGB) and near-infrared (NIR) band combinations together with the canopy height model. An overall segmentation accuracy above 95% was reached, even when RGB-based VIs were used. The proposed method is divided into three major steps: (1) segmentation and first clustering; (2) cluster isolation; and (3) feature extraction. This approach was applied to several chestnut plantations, and some parameters were automatically extracted, such as the number of trees present in a plantation (accuracy above 97%), the canopy coverage (93% to 99% accuracy), the tree height (RMSE of 0.33 m and R² = 0.86), and the crown diameter (RMSE of 0.44 m and R² = 0.96). Therefore, by enabling the substitution of time-consuming and costly field campaigns, the proposed method is a valuable contribution to managing chestnut plantations in a quicker and more sustainable way.
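The core of the VI computation described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the band values, the NDVI index (one common NIR/red VI), and the 0.4 threshold are all assumptions for demonstration.

```python
import numpy as np

# Hypothetical reflectance rasters (values in [0, 1]); in practice these
# come from the red and NIR channels of the UAV orthomosaic.
red = np.array([[0.10, 0.40],
                [0.05, 0.35]])
nir = np.array([[0.60, 0.45],
                [0.70, 0.30]])

# NDVI = (NIR - Red) / (NIR + Red); a small epsilon avoids division by zero.
ndvi = (nir - red) / (nir + red + 1e-9)

# A simple per-pixel vegetation mask from an assumed NDVI threshold;
# the reviewed method would further combine this with the canopy height
# model before segmentation and clustering.
vegetation_mask = ndvi > 0.4
```

Any other VI (RGB-based ones included) follows the same pattern: a per-pixel arithmetic combination of bands followed by thresholding or clustering.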
This study aimed to characterize vineyard vegetation through multi-temporal monitoring using a commercial low-cost rotary-wing unmanned aerial vehicle (UAV) equipped with a consumer-grade red/green/blue (RGB) sensor. Ground-truth data and UAV-based imagery were acquired on nine distinct dates, covering the most significant part of the vegetative growing cycle until the harvesting season, over two selected vineyard plots. The acquired UAV-based imagery underwent photogrammetric processing, resulting in an orthophoto mosaic per flight, used for vegetation estimation. Digital elevation models were used to compute crop surface models. By filtering vegetation within a given height range, it was possible to separate grapevine vegetation from other vegetation present in a specific vineyard plot, enabling the estimation of grapevine area and volume. The results showed high accuracy in grapevine detection (94.40%) and low error in grapevine volume estimation (root mean square error of 0.13 m and correlation coefficient of 0.78 for height estimation). The accuracy assessment showed that the proposed method based on UAV-based RGB imagery is effective and has the potential to become an operational technique. The proposed method also allows the estimation of grapevine areas that can potentially benefit from canopy management operations.
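The height-range filtering step described above can be sketched as follows. This is a simplified illustration under stated assumptions, not the study's implementation: the elevation values and the 0.5-2.5 m grapevine height band are hypothetical.

```python
import numpy as np

# Hypothetical digital surface model (DSM, top of canopy) and digital
# terrain model (DTM, bare ground), both in metres; real values would
# come from photogrammetric processing of the UAV imagery.
dsm = np.array([[102.0, 101.2],
                [101.8, 100.1]])
dtm = np.array([[100.0, 100.0],
                [100.0, 100.0]])

# Crop surface model: per-pixel height of objects above ground.
csm = dsm - dtm

# Keep only pixels whose height falls in an assumed grapevine canopy
# band, separating vines from low inter-row vegetation and tall objects.
grapevine_mask = (csm >= 0.5) & (csm <= 2.5)
```

Grapevine area then follows from the masked pixel count times the pixel ground footprint, and volume from summing the masked heights over that footprint.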
Differentiating between canopy and other vegetation cover is particularly challenging; nonetheless, it is pivotal for accurately delimiting a crop's vegetation in remote-sensing data. In this article, a method to automatically estimate and extract vineyards' canopy is proposed. It combines vegetation indices and digital elevation models, derived from high-resolution images acquired using unmanned aerial vehicles, to differentiate between the vines' canopy and inter-row vegetation cover. This enables the extraction of relevant information from a specific vineyard plot. The proposed method was applied to data acquired from several vineyards located in Portugal's north-eastern region, and the resulting parameters were validated. It proved to be effective when applied with consumer-grade sensors carried by unmanned aerial vehicles. Moreover, it also proved to be a fast and efficient way to extract vineyard information, enabling the mapping of vineyard plots for precision viticulture management tasks.
Traditionally, farmers have relied on their own senses to diagnose and monitor the health and needs of their crops. However, human perception varies from person to person, with accuracy levels largely dependent on stress, experience, health, and age. To overcome this problem, over the last decade the emergence of smartphone technology has enabled new agronomic applications that provide better, more accurate, cost-effective, and portable diagnosis systems. Conventional smartphones are equipped with several sensors that can support near-real-time routine and advanced farming activities at very low cost. Consequently, the development of agricultural applications based on smartphone devices has increased exponentially in recent years. However, the great potential offered by smartphone applications is still yet to be fully realized. Thus, this paper presents a literature review and an analysis of the characteristics of several mobile applications for smart/precision agriculture, either available on the market or developed at the research level. This provides farmers with an overview of the types of applications that exist, the features they offer, and a comparison between them. This paper is also a resource to help researchers and application developers understand the limitations of existing tools and where new contributions can be made.