Land cover maps are indispensable for decision making, monitoring, and management in agricultural areas, but they are often only available after harvesting. To obtain a timely crop map of a small-scale arable landscape in the Swiss Plateau, we acquired uncalibrated, very high-resolution data with a spatial resolution of 0.05 m and four spectral bands, using a consumer-grade camera on an unmanned aerial vehicle (UAV) in June 2015. We resampled the data to different spatial and spectral resolutions and evaluated classification performance using textural features (first-order statistics and mathematical morphology) and a random forest classifier, varying the number and size of the structuring elements. The best-performing configuration consisted of a spatial resolution of 0.5 m, three spectral bands (RGB: red, green, and blue), and five sizes of structuring elements. The overall accuracy (OA) for the full set of crop classes based on a pixel-based classification is 66.7%. For a merged set of crops, the OA increases by ~7% (to 74.0%). For an object-based classification on individual field parcels, the OA increases by ~20% (86.3% for the full set of crop classes and 94.6% for the merged set). We conclude that UAV data at 0.5 m spatial resolution are most relevant for crop classification in heterogeneous arable landscapes.
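The abstract does not give the implementation of the morphological texture features, but the idea can be sketched: for each band, grayscale erosion (local minimum) and dilation (local maximum) are computed with square structuring elements of several sizes and stacked as per-pixel features for the classifier. The function below is a minimal numpy-only illustration under that assumption; the five sizes and the function name are hypothetical, not taken from the paper.

```python
import numpy as np

def morph_texture_features(band, sizes=(3, 5, 7, 9, 11)):
    """Per-pixel texture stack for one band: grayscale erosion (local min)
    and dilation (local max) with square structuring elements of the given
    sizes (five sizes, matching the best-performing setup in the abstract)."""
    h, w = band.shape
    feats = []
    for s in sizes:
        r = s // 2
        pad = np.pad(band, r, mode="edge")
        ero = np.empty((h, w), dtype=float)
        dil = np.empty((h, w), dtype=float)
        for i in range(h):
            for j in range(w):
                win = pad[i:i + s, j:j + s]  # window centered at (i, j)
                ero[i, j] = win.min()
                dil[i, j] = win.max()
        feats += [ero, dil]
    return np.stack(feats, axis=-1)  # shape (h, w, 2 * len(sizes))

img = np.arange(36, dtype=float).reshape(6, 6)
F = morph_texture_features(img)
print(F.shape)  # (6, 6, 10)
```

In practice such a feature stack (per band, per structuring-element size) would be flattened to pixels-by-features and fed to a random forest classifier.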
This paper presents three experiments from our HyperGreding'19 campaign that combine multitemporal hyperspectral data to address several essential questions in target detection. The experiments were conducted over Greding, Germany, using a Headwall VNIR/SWIR co-aligned sensor mounted on a drone flying at an altitude of 80 m. Additionally, high-resolution aerial RGB data, GPS measurements, and reference data from a field spectrometer were recorded to support the hyperspectral data pre-processing and the evaluation of the individual experiments. The focus of the experiments is the detectability of camouflage materials and camouflaged objects. For hyperspectral analysis to transfer to a practical setting, it must be robust to realistic and changing conditions. The first experiment investigates the SAM and SAMZID approaches to change detection, demonstrating their usefulness for target detection of moving objects within the recorded scene; the goal is to suppress unwanted changes such as those caused by shadows. The second experiment evaluates the detection of different camouflage net types over two days. This includes camouflage nets that lie in shadow during one flight and are brightly illuminated in another, due to varying solar elevation angles over the course of the day. We demonstrate the performance of typical hyperspectral target detection and classification approaches for robust detection under these conditions. Finally, the third experiment aims to detect objects and materials behind the cover of camouflage nets, using a camouflage garage. We show that some materials can be detected using an unmixing approach.
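The abstract names SAM (spectral angle mapper) as the basis of the first change-detection experiment but gives no formulas; the SAMZID variant is beyond what the abstract states. As a minimal sketch, SAM measures the angle between two spectra, which is invariant to per-pixel brightness scaling — the property that makes pure illumination changes such as shadows produce near-zero change signal. The toy arrays below are illustrative, not campaign data.

```python
import numpy as np

def spectral_angle(a, b):
    """SAM: angle (radians) between two spectra. Scaling either spectrum
    by a positive constant leaves the angle unchanged, so brightness-only
    changes (e.g., a shadow passing over a pixel) score near zero."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cos, -1.0, 1.0))

# Toy change detection between two acquisition dates (2 pixels x 3 bands).
t1 = np.array([[0.2, 0.4, 0.6], [0.1, 0.1, 0.1]])
t2 = np.array([[0.4, 0.8, 1.2], [0.5, 0.1, 0.1]])  # pixel 0: only brighter
angles = np.array([spectral_angle(a, b) for a, b in zip(t1, t2)])
print(angles)  # pixel 0 ~ 0 (illumination change), pixel 1 clearly nonzero
```

Thresholding such per-pixel angles between dates flags spectrally changed pixels (e.g., relocated targets) while suppressing shadow-induced changes.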
Hyperspectral target detection experiments under nonideal conditions are scarce. An extensive multi-scale and multi-temporal field experiment was designed to expand knowledge of such conditions. A range of camouflage materials and specific targets of interest were placed in a realistic natural environment with vegetation cover and varying illumination. In several experiments, aspects such as changes in sun position, variable moisture, and relocations of targets were analysed. Using an aircraft-based and a drone-based imaging spectrometer, the target scenarios were mapped at different times of day. The data were radiometrically, atmospherically, and geometrically processed to allow subsequent data analysis. First analyses yield promising results.
The separation of crop types is essential for many agricultural applications, particularly when within-season information is required. Generally, remote sensing can provide timely information with varying accuracy over the growing season, but small structured agricultural areas may require a very high spatial resolution that exceeds current satellite capabilities. This paper presents an experiment using spectral and textural features of NIR-red-green-blue (NIR-RGB) band data sets acquired with an unmanned aerial vehicle (UAV). The study area is located in the Swiss Plateau, which has highly fragmented, small structured agricultural fields. The observations took place on 11 days between May 5 and September 29, 2015. The analyses are based on a random forest (RF) approach, estimating separation metrics for all analyzed crops. Three temporal windows of observations, based on accumulated growing degree days (AGDD), were identified: an early window (515-1232 AGDD, 5 May-17 June 2015) with an average accuracy (AA) of 70-75%; a mid-season window (1362 AGDD, 25 June-22 July 2015) with an AA of around 80%; and a late window (2626-3238 AGDD, 21 August-29 September 2015) with an AA of <65%. Crop separation is therefore most promising in the mid-season window, and an additional NIR band increases the accuracy significantly. However, discrimination of winter crops is most effective in the early window, adding further observational requirements to the first window.
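Accumulated growing degree days (AGDD), used above to delimit the temporal windows, are the running sum of daily heat units above a crop-specific base temperature. The abstract does not state the base temperature used, so the 0 °C default and the function name below are assumptions for illustration.

```python
def agdd(daily_min_max, t_base=0.0):
    """Accumulated growing degree days: sum over days of the daily mean
    temperature above t_base, clipped at zero (cold days contribute 0).
    t_base = 0 degC is a hypothetical default, not taken from the paper."""
    total = 0.0
    for t_min, t_max in daily_min_max:
        total += max((t_min + t_max) / 2.0 - t_base, 0.0)
    return total

# Three example days as (min, max) temperatures in degC.
days = [(8.0, 20.0), (10.0, 24.0), (-2.0, 4.0)]
print(agdd(days))  # 14 + 17 + 1 = 32.0
```

Expressing observation windows in AGDD rather than calendar dates makes them transferable across years and sites with different weather.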
Crop species separation is essential for a wide range of agricultural applications, in particular when within-season information is needed. In general, remote sensing can provide such information with high accuracy, but in small structured agricultural areas, very high spatial resolution (VHR) data are required. We present a study involving spectral and textural features derived from near-infrared, red, green, and blue (NIR-RGB) band datasets, acquired using an unmanned aerial vehicle (UAV), and an imaging spectroscopy (IS) dataset acquired by the Airborne Prism EXperiment (APEX). Both the individual and combined use of these datasets were analyzed with a random forest-based method for crop separability. In addition, different band reduction methods based on feature factor loadings were analyzed. The most accurate crop separation results were achieved using the IS dataset and the two combined datasets, with an average accuracy (AA) of >92%. Moreover, we conclude that with a reduced number of IS features (i.e., wavelengths), the loss of accuracy can be compensated for by additional NIR-RGB texture features (AA > 90%).
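The abstract mentions band reduction based on feature factor loadings without detailing the procedure. One common realization, sketched here as an assumption rather than the paper's actual method, is to compute component loadings from a principal-component decomposition of the band covariance and keep, per leading component, the band with the largest absolute loading. All variable names and the synthetic data are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 6))  # 100 pixels x 6 synthetic "bands"
X[:, 3] = 0.9 * X[:, 0] + rng.normal(scale=0.1, size=100)  # redundant band

# Loadings: eigenvectors of the band covariance, scaled by sqrt(eigenvalue),
# ordered from the largest-variance component down.
Xc = X - X.mean(axis=0)
eigval, eigvec = np.linalg.eigh(np.cov(Xc, rowvar=False))
order = np.argsort(eigval)[::-1]
loadings = eigvec[:, order] * np.sqrt(eigval[order])  # bands x components

# Keep the band with the highest |loading| on each of the top 3 components.
selected = [int(np.argmax(np.abs(loadings[:, k]))) for k in range(3)]
print(sorted(set(selected)))
```

The retained bands then replace the full spectrum as classifier input; per the abstract, texture features can recover much of the accuracy lost in this reduction.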