Groundwater level (GWL) and depth to water (DTW) are related metrics aimed at characterizing groundwater-table positions in peatlands, and two of the most common variables collected by researchers working in these ecosystems. While well-established field techniques exist for measuring GWL and DTW, they are generally difficult to scale. In this study, we present a novel workflow for mapping groundwater using orthophotography and photogrammetric point clouds acquired from unmanned aerial vehicles. Our approach takes advantage of the fact that pockets of surface water are normally abundant in peatlands, which we assume to be reflective of GWL in these porous, gently sloping environments. By first classifying surface water and then extracting a sample of water elevations, we can generate continuous models of GWL through interpolation. Estimates of DTW can then be obtained through additional efforts to characterize terrain. We demonstrate our methodology across a complex, 61-ha treed bog in northern Alberta, Canada. An independent accuracy assessment using 31 temporally coincident water-well measurements revealed accuracies (root mean square error) in the 20-cm range, though errors were concentrated in small upland pockets of the study area and in areas of dense tree cover. Model estimates in the open peatland areas were considerably better.
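The classify-sample-interpolate-difference chain described above can be sketched in a few lines. The coordinates, elevations, and choice of linear interpolation below are illustrative stand-ins for the study's actual water classification, sampling, and interpolation steps:

```python
import numpy as np
from scipy.interpolate import griddata

# Hypothetical sample of classified surface-water points: (x, y) in metres,
# with the photogrammetric water-surface elevation (m a.s.l.) at each point.
water_xy = np.array([[0, 0], [100, 0], [0, 100], [100, 100], [50, 50]], dtype=float)
water_z = np.array([300.0, 300.2, 299.8, 300.1, 300.0])

# Interpolate a continuous groundwater-level (GWL) surface on a regular grid.
gx, gy = np.meshgrid(np.linspace(0, 100, 11), np.linspace(0, 100, 11))
gwl = griddata(water_xy, water_z, (gx, gy), method="linear")

# Depth to water (DTW) = terrain elevation minus interpolated GWL.
# A flat stand-in DEM is used here; in practice the DEM comes from the
# photogrammetric point cloud.
terrain = np.full_like(gwl, 300.3)
dtw = terrain - gwl
```

Any interpolator could be substituted for `griddata`; the essential outputs are the continuous GWL raster and the DTW raster obtained by differencing it from the terrain model.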
Peatlands are globally significant sources of atmospheric methane (CH4). In the northern hemisphere, extensive geologic exploration activities have occurred to map petroleum deposits. In peatlands, these activities result in soil compaction and wetter conditions, changes that are likely to enhance CH4 emissions. To date, this effect has not been quantified. Here we map petroleum exploration disturbances on peatlands in Alberta, Canada, where peatlands and oil deposits are widespread. We then estimate induced CH4 emissions. By our calculations, at least 1900 km² of peatland have been affected, increasing CH4 emissions by 4.4–5.1 kt CH4 yr−1 above undisturbed conditions. These emissions are not currently included in Canada’s national reporting of greenhouse gas (GHG) emissions; their inclusion would increase current emissions from land use, land use change and forestry by 7–8%. However, uncertainty remains large. Research further investigating the effects of petroleum exploration on peatland GHG fluxes will allow appropriate consideration of these emissions in future peatland management.
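As a rough consistency check on these figures, the per-area emission factor implied by the reported totals can be back-calculated. The variable names are hypothetical and only unit conversions are involved (kt → kg, km² → ha):

```python
area_km2 = 1900.0             # minimum affected peatland area (km²)
extra_kt_per_yr = (4.4, 5.1)  # reported added CH4 emissions (kt CH4 yr⁻¹)

area_ha = area_km2 * 100.0    # 1 km² = 100 ha
# Implied average increase in kg CH4 per hectare of disturbed peatland per year:
factor_kg_ha_yr = tuple(kt * 1e6 / area_ha for kt in extra_kt_per_yr)
# works out to roughly 23-27 kg CH4 ha⁻¹ yr⁻¹ over the affected area
```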
Peatlands are globally significant stores of soil carbon, where local methane (CH4) emissions are strongly linked to water table position and microtopography. Historically, these factors have been difficult to measure in the field, constraining our capacity to observe local patterns of variability. In this paper, we show how remote sensing surveys conducted from unmanned aerial vehicle (UAV) platforms can be used to map microtopography and depth to water over large areas with good accuracy, paving the way for spatially explicit estimates of CH4 emissions. This approach enabled us to observe—for the first time—the effects of low‐impact seismic lines (LIS; petroleum exploration corridors) on surface morphology and CH4 emissions in a treed‐bog ecosystem in northern Alberta, Canada. Through compaction, LIS were found to reduce the observed range in microtopographic elevation by 46 cm and to decrease mean depth to water by 15.4 cm, compared to surrounding undisturbed conditions. These alterations are projected to increase CH4 emissions by 20–120% relative to undisturbed areas of our study site, which translates to an added 0.011–0.027 kg CH4 day−1 per linear kilometer of LIS (~2 m wide). The ~16 km of LIS present at our 61-ha study site were predicted to boost CH4 emissions by 20–70 kg between May and September 2016.
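The site-level total follows directly from the per-kilometre rates. A back-of-envelope check, assuming a ~153-day May–September season (the abstract does not state the exact day count), reproduces the reported range:

```python
per_km_per_day = (0.011, 0.027)  # added emissions, kg CH4 day⁻¹ per km of LIS
km_of_lis = 16.0                 # approximate LIS length at the study site
season_days = 153                # assumed May 1 - Sep 30 season

# Season total added emissions (kg CH4) for the whole site:
totals = tuple(rate * km_of_lis * season_days for rate in per_km_per_day)
# roughly 27-66 kg, consistent with the reported 20-70 kg range
```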
Microtopographic variability in peatlands has a strong influence on greenhouse gas fluxes, but we lack the ability to characterize terrain in these environments efficiently over large areas. To address this, we assessed the capacity of photogrammetric data acquired from an unmanned aerial vehicle (UAV or drone) to reproduce ground elevations measured in the field. In particular, we set out to evaluate the role of (i) vegetation/surface complexity and (ii) supplementary LiDAR data on results. We compared remote-sensing observations to reference measurements acquired with survey-grade GPS equipment at 678 sample points, distributed across a 61-hectare treed bog in northwestern Alberta, Canada. UAV photogrammetric data were found to capture elevation with accuracies (root-mean-square error) ranging from 14 to 42 cm, depending on the state of vegetation/surface complexity. We judge the technology to perform well under all but the most complex conditions, where ground visibility is hindered by thick vegetation. Supplementary LiDAR data did not improve results significantly, nor did it perform well as a stand-alone technology at the low densities typically available to researchers.
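The accuracy metric used throughout this comparison is root-mean-square error. A minimal implementation for comparing UAV-derived and GPS reference elevations, with illustrative values rather than the study's 678 points:

```python
import numpy as np

def rmse(predicted, reference):
    """Root-mean-square error between two elevation samples (same units)."""
    predicted = np.asarray(predicted, dtype=float)
    reference = np.asarray(reference, dtype=float)
    return float(np.sqrt(np.mean((predicted - reference) ** 2)))

# Illustrative elevations (m a.s.l.), not the study's data:
uav_z = [300.10, 300.35, 299.90, 300.20]  # photogrammetric surface
gps_z = [300.00, 300.20, 300.00, 300.10]  # survey-grade GPS reference
error_m = rmse(uav_z, gps_z)
```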
Lichen is an important food source for caribou in Canada. Lichen mapping using remote sensing (RS) images can be challenging, however, as lichens generally appear in small, unevenly distributed patches and can resemble other surficial features. Moreover, collecting labeled lichen data (reference data) is expensive, which restricts the application of many robust supervised classification models that generally demand large quantities of labeled data. The goal of this study was to investigate the potential of using a very-high-spatial-resolution (1-cm) lichen map of a small sample site (e.g., generated based on a single UAV scene and using field data) to train a subsequent classifier to map caribou lichen over a much larger area (~0.04 km2 vs. ~195 km2) and a lower-spatial-resolution image (in this case, a 50-cm WorldView-2 image). The limited labeled data from the sample site were also partially noisy due to spatial and temporal mismatching issues. For this, we deployed a recently proposed Teacher-Student semi-supervised learning (SSL) approach (based on U-Net and U-Net++ networks) involving unlabeled data to assist with improving the model performance. Our experiments showed that it was possible to scale up the UAV-derived lichen map to the WorldView-2 scale with reasonable accuracy (overall accuracy of 85.28% and F1-score of 84.38%) without collecting any samples directly in the WorldView-2 scene. We also found that our noisy labels were partially beneficial to the SSL robustness because they improved the false positive rate compared to the use of a cleaner training set directly collected within the same area in the WorldView-2 image. As a result, this research offers new insights into how current very-high-resolution, small-scale caribou lichen maps can be used to generate more accurate large-scale caribou lichen maps from high-resolution satellite imagery.
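The Teacher-Student SSL loop can be sketched at its simplest: train a teacher on the small labeled set, pseudo-label the unlabeled data where the teacher is confident, then train a student on both. The sketch below substitutes a toy nearest-centroid classifier and synthetic 2-feature data for the U-Net/U-Net++ networks and imagery actually used; the confidence rule (margin above the median) is likewise invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_centroids(X, y):
    # Nearest-centroid "model" standing in for the U-Net teacher/student.
    return np.stack([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(centroids, X):
    # Return predicted class and per-class distances (the "confidence" signal).
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1), d

# Toy features standing in for image tiles (hypothetical, linearly separable).
X_lab = rng.normal(0, 1, (40, 2))
y_lab = (X_lab[:, 0] > 0).astype(int)
X_unlab = rng.normal(0, 1, (400, 2))

teacher = fit_centroids(X_lab, y_lab)          # 1. train teacher on labels
pred_u, d = predict(teacher, X_unlab)          # 2. pseudo-label unlabeled data
margin = np.abs(d[:, 0] - d[:, 1])
keep = margin > np.median(margin)              #    keep only confident pseudo-labels
student = fit_centroids(                       # 3. retrain student on both sets
    np.vstack([X_lab, X_unlab[keep]]),
    np.concatenate([y_lab, pred_u[keep]]),
)
```

The study's finding that noisy labels partly helped robustness suggests the confidence-filtering step matters more than label purity; the threshold here is only one of many possible rules.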
Shadows from buildings, terrain, and other elevated features represent lost and (or) impaired data values that hinder the quality of optical images acquired under all but the most diffuse illumination conditions. This is particularly problematic in high-spatial-resolution imagery acquired from unmanned aerial vehicles (UAVs), which generally operate very close to the ground. However, the flexibility and low cost of re-deployment of the platform also presents opportunities, which we capitalize on in a new workflow designed to eliminate shadows from UAV-based orthomosaics. Our straightforward, three-step procedure relies on images acquired from two different UAV flights, where illumination conditions produce diverging shadow orientations: one before solar noon and another after. From this multi-temporal image stack, we first identify and then eliminate shadows from individual orthophoto components, then construct the final orthomosaic using a feature-matching strategy with the commercial software package Photoscan. The utility of our strategy is demonstrated over a treed-wetland study site in northwestern Alberta, Canada; a complex scene containing a wide variety of shadows, which our workflow effectively eliminated. While shadow-reduced orthomosaics are generally less useful for feature-identification tasks that rely on the shadow element of image interpretation, they create a superior foundation for most other image-processing routines, including classification and change-detection.
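The key idea above, that two flights with diverging shadow orientations leave each pixel illuminated in at least one image, can be illustrated with the simplest possible fusion rule: a per-pixel maximum over the co-registered image stack. Note the published workflow instead classifies shadows explicitly and rebuilds the mosaic via feature matching in Photoscan; the max-compositing rule below is only an illustrative stand-in, with hypothetical pixel values:

```python
import numpy as np

# Two co-registered orthophotos of the same scene (hypothetical 8-bit values),
# flown before and after solar noon so shadows fall in different directions.
am = np.array([[200, 40], [180, 190]], dtype=np.uint8)   # 40 = shadowed pixel
pm = np.array([[195, 185], [35, 195]], dtype=np.uint8)   # 35 = shadowed pixel

# Each pixel is shadowed in at most one flight, so the per-pixel maximum
# recovers an illuminated value everywhere.
shadow_free = np.maximum(am, pm)
```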
Relating ground photographs to UAV orthomosaics is a key linkage required for accurate multi-scaled lichen mapping. Conventional methods of multi-scaled lichen mapping, such as random forest models and convolutional neural networks, rely heavily on pixel DN values for classification. However, the limited spectral range of ground photos requires additional characteristics to differentiate lichen from spectrally similar objects, such as bright logs. By applying a neural network to tiles of a UAV orthomosaic, additional characteristics, such as surface texture and spatial patterns, can be used for inference. Our methodology used a neural network (UAV LiCNN) trained on ground photo mosaics to predict lichen in UAV orthomosaic tiles. The UAV LiCNN achieved mean user and producer accuracies of 85.84% and 92.93%, respectively, in the high lichen class across eight different orthomosaics. We compared the known lichen percentages found in 77 vegetation microplots with the predicted lichen percentages calculated from the UAV LiCNN, resulting in an R2 of 0.6910. This research shows that AI models trained on ground photographs effectively classify lichen in UAV orthomosaics. Limiting factors include the misclassification of objects spectrally similar to lichen in the RGB bands and dark shadows cast by vegetation.
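The microplot comparison reduces to a coefficient of determination between observed and predicted lichen cover. A minimal version, computed here as the squared Pearson correlation of a linear fit, with illustrative values rather than the study's 77 microplots:

```python
import numpy as np

def r2_fit(x, y):
    """R^2 of a least-squares linear fit of y on x
    (squared Pearson correlation)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    r = np.corrcoef(x, y)[0, 1]
    return r * r

# Illustrative lichen cover percentages, not the study's microplot data:
observed = [10, 25, 40, 55, 70]   # field microplot lichen %
predicted = [12, 20, 45, 50, 75]  # UAV LiCNN tile-level lichen %
r2 = r2_fit(observed, predicted)
```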