Ecosystem structure, especially vertical vegetation structure, is one of the six essential biodiversity variable classes and an important aspect of habitat heterogeneity, affecting species distributions and diversity by providing shelter, foraging, and nesting sites. Point clouds from airborne laser scanning (ALS) can be used to derive such detailed information on vegetation structure. However, public agencies usually provide only digital elevation models, which contain no information on vertical vegetation structure. Calculating vertical structure variables from ALS point clouds requires extensive data processing and remote sensing skills that most ecologists do not have, yet such information on vegetation structure is extremely valuable for many analyses of habitat use and species distribution. Here we propose 10 variables that should be made easily accessible to researchers and stakeholders through national data portals. In addition, we argue for a consistent selection of variables and their systematic testing, which would allow such a list to be continuously improved and kept up to date with the latest evidence. This initiative is needed not only to advance ecological and biodiversity research by providing valuable open datasets, but also to guide potential users in the face of the increasing availability of global vegetation structure products.
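The abstract does not list the 10 proposed variables, but the kind of vertical-structure metric it refers to can be sketched from normalized ALS point heights. The metrics below (95th-percentile height, foliage height diversity as a Shannon index over height bins, canopy cover above 2 m) are common examples chosen for illustration, not the specific list proposed by the authors.

```python
import math

def vertical_structure_metrics(heights, bin_width=1.0):
    """Illustrative vertical-structure metrics from normalized ALS point
    heights (metres above ground). Generic examples only, not the
    specific 10-variable list proposed in the text."""
    hs = sorted(heights)
    n = len(hs)
    # 95th-percentile height, a common canopy-height proxy
    p95 = hs[min(n - 1, int(round(0.95 * (n - 1))))]
    # Foliage height diversity: Shannon index over fixed height bins
    bins = {}
    for h in hs:
        b = int(h // bin_width)
        bins[b] = bins.get(b, 0) + 1
    fhd = -sum((c / n) * math.log(c / n) for c in bins.values())
    # Canopy cover proxy: fraction of returns above 2 m
    cover = sum(1 for h in hs if h > 2.0) / n
    return {"max_height": hs[-1], "p95": p95, "fhd": fhd, "cover_2m": cover}

# Toy example: eight normalized point heights from one plot
m = vertical_structure_metrics([0.1, 0.4, 1.2, 3.5, 7.8, 12.0, 14.6, 15.1])
```

In practice such metrics would be computed per grid cell over national ALS coverage, which is exactly the processing burden the abstract argues data portals should take off individual researchers.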
With ongoing advances in computer vision and Earth observation capabilities, unmanned aerial vehicles (UAVs) allow extensive forest inventory and the indirect description of stand structure. We performed several flights at different flight levels, with different UAVs and popular sensors, over two sites with coniferous forests of various ages, using the default settings preset by the solution suppliers. The data were processed using image-matching techniques, yielding digital surface models, which were further analyzed using the lidR package in R. Consumer-grade RGB cameras were consistently more successful in identifying individual trees at all flight levels (84–77% for the Phantom 4), whereas the success of multispectral cameras decreased with higher flight levels and smaller crowns (77–54% for the RedEdge-M). Regarding the accuracy of the measured crown diameters, RGB cameras yielded satisfactory results (mean absolute error, MAE, of 0.79–0.99 m and 0.88–1.16 m for the Phantom 4 and Zenmuse X5S, respectively); multispectral cameras overestimated the height, especially in full-grown forests (MAE = 1.26–1.77 m). We conclude that widely used low-cost RGB cameras yield very satisfactory results for describing forest structure at a 150 m flight altitude. When (multi)spectral information is needed, we recommend reducing the flight level to 100 m in order to acquire sufficient structural information. The study contributes to current knowledge by directly comparing widely used consumer-grade UAV cameras and providing a clear elementary workflow for inexperienced users, thus helping entry-level users with the initial steps and supporting the usability of such data in practice.
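The core of the workflow described here (a canopy height surface, individual tree detection, and accuracy reported as MAE) can be sketched in simplified form. The study itself used the lidR package in R; the pure-Python version below, with a toy grid and a 3×3 local-maximum rule, is an illustrative assumption standing in for lidR-style detection, not the authors' actual code.

```python
def local_maxima_trees(chm, min_height=2.0):
    """Detect tree tops as strict local maxima in a canopy height model
    (CHM) grid; a simplified stand-in for lidR-style tree detection.
    chm: 2-D list of heights in metres."""
    rows, cols = len(chm), len(chm[0])
    tops = []
    for r in range(rows):
        for c in range(cols):
            h = chm[r][c]
            if h < min_height:
                continue  # ignore ground and low vegetation
            neighbours = [chm[rr][cc]
                          for rr in range(max(0, r - 1), min(rows, r + 2))
                          for cc in range(max(0, c - 1), min(cols, c + 2))
                          if (rr, cc) != (r, c)]
            if all(h > v for v in neighbours):
                tops.append((r, c, h))
    return tops

def mean_absolute_error(measured, reference):
    """MAE, as reported for the crown-diameter and height comparisons."""
    return sum(abs(m - r) for m, r in zip(measured, reference)) / len(measured)

# Toy 4x4 CHM with two obvious tree tops
chm = [
    [0.0, 0.5, 0.3, 0.2],
    [0.4, 9.8, 0.6, 0.1],
    [0.2, 0.7, 0.5, 7.4],
    [0.1, 0.3, 0.6, 0.9],
]
tops = local_maxima_trees(chm)
# Hypothetical measured vs field-reference crown diameters (m)
mae = mean_absolute_error([4.1, 3.6], [3.8, 4.0])
```

In lidR the equivalent step would be a local-maximum filter over a rasterized CHM; the window size plays the same role as the 3×3 neighbourhood here and, as the abstract shows, the reliability of the result depends strongly on sensor and flight level.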