In an increasingly interconnected world, human–environment interactions involving flows of people, organisms, goods, information, and energy are expanding in magnitude and extent, often over long distances. As a universal paradigm for examining these interactions, the telecoupling framework (published in 2013) has been widely applied around the world by researchers from diverse disciplines. We conducted a systematic review of the first five years of telecoupling research to evaluate the state of telecoupling science and to identify strengths, areas for improvement, and promising avenues for future study. We identified 89 studies using any variant of the term telecoupling. These works emphasize trade flows, information transfer, and species dispersal at international, national, and regional scales involving one or a few countries, with China, Brazil, and the United States being the most frequently studied countries. Our review showed a rising trend in publications and citations on telecoupling, with 63% of the identified studies using the framework's specific terminology (e.g., "flows", "agents"). This suggests that future telecoupling studies could adopt the standardized telecoupling language to better coordinate, synthesize, and operationalize interdisciplinary research. Compelling topics for future research include operationalization of the telecoupling framework, commonalities among telecouplings, telecoupling mechanisms and causality, and the governance of telecoupled systems. Overall, the first five years of telecoupling research have improved our understanding of human–environment interactions, laying a promising foundation for future social–ecological research in a telecoupled world.
Producing accurate crop maps during the current growing season is essential for effective agricultural monitoring. Substantial effort has been devoted to studying regional crop distribution from year to year, but less attention has been paid to the dynamics of crop composition and spatial extent within a season. Understanding how crops are distributed at early developmental stages allows timely adjustment of the crop planting structure and supports agricultural decision making and management. To address this knowledge gap, this study presents an approach that integrates object-based image analysis with random forest (RF) for mapping in-season crop types from multi-temporal GaoFen satellite data with a spatial resolution of 16 m. A multiresolution, local-variance strategy was used to create crop objects, and object-based spectral and textural features and vegetation indices were then extracted from those objects. The RF classifier was employed to identify crop types at four crop growth seasons by integrating the available features, and classification performance in each season was assessed with F-scores. Results show that crop maps derived from seasonal features achieved overall accuracies above 87%. Compared with spectral features alone, a combination of in-season textures with multi-temporal spectral features and vegetation indices performed best for classifying crop types. Spectral and temporal information is more important than texture features for crop mapping; however, texture can be essential when spectral and temporal information is insufficient (e.g., crop identification in early spring). These results indicate that object-based image analysis combined with random forest has considerable potential for in-season crop mapping using high-spatial-resolution imagery.
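The classification step described in this abstract (per-object features fed to a random forest, with per-class F-scores for evaluation) can be illustrated with a minimal sketch. The file path, column names, and hyperparameters below are assumptions for illustration, not the authors' exact pipeline; the sketch assumes object-based spectral means, texture statistics, and vegetation indices have already been exported to a feature table.

```python
# Minimal sketch of object-based RF crop classification (hypothetical inputs).
# Assumes per-object features (spectral means, textures, vegetation indices)
# were already extracted from segmented image objects into a CSV table.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, f1_score

objects = pd.read_csv("object_features.csv")  # assumed file with labeled objects
feature_cols = [c for c in objects.columns if c not in ("object_id", "crop_type")]
X = objects[feature_cols].to_numpy()
y = objects["crop_type"].to_numpy()

# Hold out part of the labeled objects for validation.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42
)

rf = RandomForestClassifier(n_estimators=500, random_state=42, n_jobs=-1)
rf.fit(X_train, y_train)
pred = rf.predict(X_test)

print("Overall accuracy:", accuracy_score(y_test, pred))
print("Per-class F-scores:",
      dict(zip(rf.classes_, f1_score(y_test, pred, average=None))))

# Feature importances give a rough indication of whether spectral, temporal,
# or textural variables dominate the classification at a given growth stage.
importances = pd.Series(rf.feature_importances_, index=feature_cols)
print(importances.sort_values(ascending=False).head(10))
```

Running this sketch for features extracted at each growth stage would reproduce, in outline, the season-by-season comparison of spectral, temporal, and textural contributions reported in the study.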
Accurate crop distribution maps provide important information for crop censuses, yield monitoring, and agricultural insurance assessments. Most existing studies use low-spatial-resolution satellite images for crop distribution mapping, even in areas with fragmented landscapes. Unmanned aerial vehicle (UAV) imagery provides an alternative source for crop mapping, yet its spectral resolution is usually lower than that of satellite images. To produce more accurate maps without losing spatial heterogeneity (e.g., the physical boundaries of land parcels), this study fuses Sentinel-2A and UAV images to map crop distribution at a finer spatial scale (i.e., the land parcel scale) in an experimental site with various cropping patterns in Heilongjiang Province, Northeast China. Using a random forest algorithm, the original and fused images are classified into 10 categories: rice, corn, soybean, buckwheat, other vegetation, greenhouses, bare land, water, roads, and houses. In addition, we test the effect of UAV image choice by fusing Sentinel-2A with UAV images at multiple spatial resolutions: 0.03 m, 0.10 m, 0.50 m, 1.00 m, and 3.00 m. Overall, the fused images achieved classification accuracies 10.58% to 16.39% higher than those of the original images. However, the fusion based on the finest UAV image (0.03 m) did not yield the highest accuracy; instead, the 0.10 m UAV image produced the most accurate map. When the spatial resolution is coarser than 0.10 m, accuracy decreases gradually as the resolution becomes coarser. These results not only indicate the feasibility of combining satellite and UAV images for parcel-level crop mapping in fragmented landscapes, but also suggest a scheme for choosing the optimal spatial resolution when fusing UAV images with Sentinel-2A, with little to no adverse side effects.
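One common way to realize the fusion step sketched in this abstract is to resample both images to a common grid and stack their bands before classification. The sketch below is a simplified assumption-laden illustration, not the authors' exact method: the file names and band layouts are hypothetical, and the two inputs are assumed to be co-registered and to cover the same extent.

```python
# Minimal sketch of pixel-level fusion of UAV and Sentinel-2A imagery
# (assumed, co-registered inputs covering the same extent).
import numpy as np
import rasterio
from rasterio.enums import Resampling

# Hypothetical input files.
with rasterio.open("uav_0_10m.tif") as uav:            # 0.10 m UAV orthomosaic (assumed)
    uav_bands = uav.read()                             # (bands, rows, cols)
    target_shape = (uav.height, uav.width)

with rasterio.open("sentinel2a_10m.tif") as s2:        # Sentinel-2A subset (assumed)
    # Resample the Sentinel-2A bands onto the UAV grid
    # (bilinear resampling for continuous reflectance values).
    s2_bands = s2.read(
        out_shape=(s2.count, target_shape[0], target_shape[1]),
        resampling=Resampling.bilinear,
    )

# Fused stack: UAV bands followed by the resampled Sentinel-2A bands.
fused = np.concatenate([uav_bands, s2_bands], axis=0)
print("Fused image shape (bands, rows, cols):", fused.shape)

# Each pixel then becomes one feature vector for a random forest classifier;
# training labels would come from reference polygons for the 10 classes.
pixels = fused.reshape(fused.shape[0], -1).T            # (n_pixels, n_bands)
```

Repeating this fusion with UAV inputs at 0.03 m, 0.10 m, 0.50 m, 1.00 m, and 3.00 m, and classifying each fused stack, would mirror the resolution comparison reported in the study.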