Analysis and Evaluation of the Image Preprocessing Process of a Six-Band Multispectral Camera Mounted on an Unmanned Aerial Vehicle for Winter Wheat Monitoring
Abstract: Unmanned aerial vehicle (UAV)-based multispectral sensors have great potential in crop monitoring due to their high flexibility, high spatial resolution, and ease of operation. Image preprocessing, however, is a prerequisite to making full use of the acquired high-quality data in practical applications. Most crop monitoring studies have focused on specific procedures or applications, and there has been little attempt to examine the accuracy of the data preprocessing steps. This study focuses on the preprocessing…
“…According to the principle of light absorption and reflectance, spectral analysis, imaging spectroscopy, and other nondestructive technologies have been widely used in crop monitoring [4][5][6][7][8]. Combined with the development of airborne and unmanned aerial vehicle (UAV) platforms [9], imaging spectroscopy at high spatial and temporal resolution has become a preferred method and research topic in farmland estimation owing to its efficiency and non-invasiveness [10][11][12]. Thus, this article aims to use a UAV-mounted multispectral sensor to collect maize canopy spectral data in the field and conduct a rapid diagnosis of chlorophyll content to estimate growth status and guide field management.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
“…Most current studies on spectral imaging focus on the diagnosis of chlorophyll content [11][12][13]. These studies follow three directions: analysis of the spectral response [14][15][16], quantification and selection of sensitive parameters [17,18], and optimization of models [19][20][21][22] on the basis of visible and near-infrared images.…”
To improve the diagnostic accuracy of chlorophyll content in the maize canopy, remote sensing images of the maize canopy at multiple growth stages were acquired by an unmanned aerial vehicle (UAV) equipped with a spectral camera. Dynamic influencing factors in the canopy multispectral images were removed using different image segmentation methods, and the chlorophyll content of field-grown maize was diagnosed. Crop canopy spectral reflectance, coverage, and texture information were combined to compare the segmentation methods, and a whole-growth-stage maize canopy chlorophyll content diagnostic model was built on each. Results showed that the segmentation methods differed in how well they extracted maize canopy parameters. The wavelet segmentation method outperformed the threshold and ExG index segmentation methods: it removed the soil background, reduced the texture complexity of the image, and achieved satisfactory results. Maize canopy multispectral band reflectance and vegetation indices were extracted for each segmentation method, and a partial least squares regression algorithm was used to construct the diagnostic model. Model accuracy was low when the image background was not removed (Rc2 (determination coefficient of the calibration set) = 0.5431, RMSEF (root mean squared error of forecast) = 4.2184, MAE (mean absolute error) = 3.24; Rv2 (determination coefficient of the validation set) = 0.5894, RMSEP (root mean squared error of prediction) = 4.6947, MAE = 3.36). Extracting the maize canopy before modeling improved diagnostic accuracy, and the model based on the wavelet segmentation method had the highest accuracy (Rc2 = 0.6638, RMSEF = 3.6211, MAE = 2.89; Rv2 = 0.6923, RMSEP = 3.9067, MAE = 3.19). The research provides a feasible method for crop growth and nutrition monitoring from a UAV platform and has guiding significance for crop cultivation management.
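The ExG (excess green) index used as a segmentation baseline in the study above has a standard formulation: ExG = 2g − r − b on chromaticity-normalized channels. A minimal sketch follows; the threshold value is an illustrative assumption, not the paper's calibrated one:

```python
import numpy as np

def exg_mask(rgb, thresh=0.1):
    """Vegetation mask from an RGB array of shape (..., 3) using the
    excess-green index on chromaticity-normalized channels.
    `thresh` is an illustrative cutoff, not a calibrated value."""
    total = rgb.sum(axis=-1, keepdims=True) + 1e-9  # avoid divide-by-zero
    r, g, b = np.moveaxis(rgb / total, -1, 0)       # normalized channels
    exg = 2 * g - r - b                              # excess-green index
    return exg > thresh                              # True = vegetation

# one green (canopy-like) pixel and one brownish (soil-like) pixel
pixels = np.array([[[0.2, 0.6, 0.2], [0.5, 0.4, 0.3]]])
mask = exg_mask(pixels)
```

Pixels classified as soil by the mask can then be excluded before averaging band reflectances over a plot, which is the role segmentation plays in the diagnostic pipeline.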
“…While UAV imagery has not previously been used for predicting LAI and chlorophyll content of quinoa plants, it has been used extensively to map LAI and chlorophyll content of other crops. For example, multispectral UAV imagery has been used to predict winter wheat LAI at critical growth stages (Jiang et al., 2019b), seasonal leaf area dynamics of sorghum breeding lines (Potgieter et al., 2017), soil-plant analysis development (SPAD)-measured chlorophyll content values of maize (Deng et al., 2018), and the leaf chlorophyll content of a potato crop (Roosjen et al., 2018). Vegetation indices (VIs) extracted from UAV-based imagery have been the most commonly used information to predict LAI and chlorophyll content of crops (Jin et al., 2020).…”
Given its high nutritional value and capacity to grow in harsh environments, quinoa has significant potential to address a range of food security concerns. Monitoring the development of phenotypic traits during field trials can provide insights into the varieties best suited to specific environmental conditions and management strategies. Unmanned aerial vehicles (UAVs) provide a promising means for phenotyping and offer the potential for new insights into relative plant performance. During a field trial exploring 141 quinoa accessions, a UAV-based multispectral camera was deployed to retrieve leaf area index (LAI) and SPAD-based chlorophyll across 378 control and 378 saline-irrigated plots using a random forest regression approach based on both individual spectral bands and 25 different vegetation indices (VIs) derived from the multispectral imagery. Results show that most VIs had stronger correlation with the LAI and SPAD-based chlorophyll measurements than individual bands. VIs including the red-edge band had high importance in SPAD-based chlorophyll predictions, while VIs including the near infrared band (but not the red-edge band) improved LAI prediction models. When applied to individual treatments (i.e. control or saline), the models trained using all data (i.e. both control and saline data) achieved high mapping accuracies for LAI (R2 = 0.977–0.980, RMSE = 0.119–0.167) and SPAD-based chlorophyll (R2 = 0.983–0.986, RMSE = 2.535–2.861). Overall, the study demonstrated that UAV-based remote sensing is not only useful for retrieving important phenotypic traits of quinoa, but that machine learning models trained on all available measurements can provide robust predictions for abiotic stress experiments.
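The band-plus-VI random forest workflow described above can be sketched roughly as follows. The band values, the choice of VIs (NDVI and a red-edge variant, NDRE), and the synthetic SPAD-like target are all illustrative assumptions, not the study's data or its 25-index feature set:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 200  # hypothetical number of plots

# synthetic per-plot band reflectances (illustrative ranges)
red = rng.uniform(0.02, 0.10, n)
rededge = rng.uniform(0.10, 0.30, n)
nir = rng.uniform(0.30, 0.60, n)

# two common VIs derived from the bands
ndvi = (nir - red) / (nir + red)
ndre = (nir - rededge) / (nir + rededge)

# predictors: individual bands plus VIs, as in the study's setup
X = np.column_stack([red, rededge, nir, ndvi, ndre])
# synthetic SPAD-like target driven mainly by the red-edge VI
y = 20 + 60 * ndre + rng.normal(0, 1, n)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
importances = dict(zip(["red", "rededge", "nir", "ndvi", "ndre"],
                       model.feature_importances_.round(3)))
```

Inspecting `feature_importances_` is one way such a model can indicate which bands or VIs dominate the predictions, analogous to the red-edge and NIR importance findings reported above.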
“…With the Tetracam Micro-MCA, the sixth channel presents an irradiance sensor, called an incident light sensor (ILS), which contains a band-pass filter and an optical fiber (Tetracam, 2020). Jiang et al. (2019) assessed the Micro-MCA for monitoring winter wheat crops. They performed radiometric calibration on a Micro-MCA and compared the reflectance transformation using panels and ELM versus the direct reflectance using the camera irradiance sensor.…”
Multi- and hyperspectral cameras on drones can be valuable tools in environmental monitoring. A significant shortcoming complicating their usage in quantitative remote sensing applications is the lack of sufficiently robust radiometric calibration methods. In a direct reflectance transformation method, the drone is equipped with a camera and an irradiance sensor, allowing transformation of image pixel values to reflectance factors without ground reference data. This method requires the sensors to be calibrated with higher accuracy than what is usually required by the empirical line method (ELM), but consequently it offers benefits in robustness, ease of operation, and ability to be used on Beyond-Visual-Line-of-Sight flights. The objective of this study was to develop and assess a drone-based workflow for direct reflectance transformation and implement it on our hyperspectral remote sensing system. A novel atmospheric correction method is also introduced, using two reference panels, but, unlike in the ELM, the correction is not directly affected by changes in the illumination. The sensor system consists of a hyperspectral camera (Rikola HSI, by Senop) and an onboard irradiance spectrometer (FGI AIRS), which were both given thorough radiometric calibrations. In laboratory tests and in a flight experiment, the FGI AIRS tilt-corrected irradiances had accuracy better than 1.9% at solar zenith angles up to 70°. The system's low-altitude reflectance factor accuracy was assessed in a flight experiment using reflectance reference panels, where the normalized root mean square errors (NRMSE) were less than ±2% for the light panels (25% and 50%) and less than ±4% for the dark panels (5% and 10%). In the high-altitude images, taken at 100-150 m altitude, the NRMSEs without atmospheric correction were within 1.4%-8.7% for VIS bands and 2.0%-18.5% for NIR bands. Significant atmospheric effects appeared already at 50 m flight altitude.
The proposed atmospheric correction was found to be practical, and it decreased the high-altitude NRMSEs to 1.3%-2.6% for VIS bands and to 2.3%-5.3% for NIR bands. Overall, the workflow was found to be efficient and to provide accuracies similar to the ELM, while offering operational advantages in challenging scenarios such as forest monitoring, large-scale autonomous mapping tasks, and real-time applications. Tests in varying illumination conditions showed that the reflectance factors of the gravel and vegetation targets varied up to 8% between sunny and cloudy conditions due to reflectance anisotropy effects, while the direct reflectance workflow had better accuracy. This suggests that varying illumination conditions have to be further accounted for in drone-based quantitative remote sensing applications.
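The empirical line method (ELM) that the direct reflectance workflow is benchmarked against fits, per band, a linear gain and offset mapping raw digital numbers (DNs) to reflectance factors from panels of known reflectance. A minimal single-band sketch follows; the panel DN values are hypothetical:

```python
import numpy as np

def empirical_line(dn, ref):
    """Fit gain/offset mapping raw DNs to reflectance factors via
    least squares from reference-panel measurements (classic ELM).
    With exactly two panels, the fit passes through both points."""
    A = np.column_stack([dn, np.ones_like(dn)])
    (gain, offset), *_ = np.linalg.lstsq(A, ref, rcond=None)
    return gain, offset

# hypothetical DNs observed over the 5% and 50% panels in one band
gain, offset = empirical_line(np.array([120.0, 980.0]),
                              np.array([0.05, 0.50]))

# apply the fitted line to image DNs to obtain reflectance factors
image_dn = np.array([300.0, 700.0])
reflectance = gain * image_dn + offset
```

The direct reflectance approach replaces this panel-based fit with calibrated radiance and onboard irradiance measurements, which is what removes the need for ground reference data on each flight.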