Urban remote sensing, which takes cities as its observation targets, is an important branch of remote sensing. Given the complexity of urban scenes, urban observation requires data with high temporal, spatial, and spectral resolution. To the best of our knowledge, however, no single satellite offers all three characteristics. It is therefore necessary to coordinate data from existing remote sensing satellites to meet the needs of urban observation. In this study, we abstract the urban remote sensing observation process and propose an urban spatio-temporal-spectral observation model, filling the gap left by the absence of a unified urban remote sensing framework. We present four applications to illustrate the proposed model: 1) a spatio-temporal fusion model for synthesizing ideal data, 2) a spatio-spectral observation model for urban vegetation biomass estimation, 3) a temporal-spectral observation model for urban flood mapping, and 4) a spatio-temporal-spectral model for impervious surface extraction. We believe that the proposed model, although still at a conceptual stage, can greatly benefit urban observation by providing a new data fusion paradigm.
This paper addresses the fusion of optical and Synthetic Aperture Radar (SAR) images. Intensity–Hue–Saturation (IHS) is an easily implemented fusion method that separates Red–Green–Blue (RGB) images into three independent components; however, applying it directly to optical–SAR fusion causes spectral distortion. The Gradient Transfer Fusion (GTF) algorithm was originally proposed for fusing infrared and grayscale visible images; it formulates image fusion as an optimization problem that preserves radiation information and spatial details simultaneously. However, GTF assumes that the spatial details come from only one of the source images, which is inconsistent with the actual situation of optical–SAR fusion. In this paper, we propose a fusion algorithm for optical and SAR images, named IHS-GTF, which combines the advantages of IHS and GTF and draws spatial details from both images based on pixel saliency. The proposed method was assessed by visual analysis and ten quantitative indices, and was further tested by extracting impervious surface (IS) from the fused image with a random forest classifier. The results show that the proposed method preserves spatial details and spectral information well, and that the overall accuracy of IS extraction is 2% higher than that obtained using the optical image alone. These results demonstrate that the proposed method can effectively fuse optical and SAR data into useful products.
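To make the IHS step concrete, below is a minimal sketch of a linear IHS transform and the classical intensity-substitution fusion the abstract refers to, where the SAR band replaces the intensity component of the optical image. The function names, the linear (triangular) IHS variant, and the simple moment matching are illustrative assumptions for exposition, not the paper's exact IHS-GTF implementation; it is precisely this wholesale substitution of intensity that causes the spectral distortion the paper aims to reduce.

```python
import numpy as np

def rgb_to_ihs(rgb):
    """Linear IHS transform: intensity I plus two chromatic components v1, v2."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    i = (r + g + b) / 3.0
    v1 = (2 * b - r - g) / np.sqrt(6)
    v2 = (r - g) / np.sqrt(2)
    return i, v1, v2

def ihs_to_rgb(i, v1, v2):
    """Inverse of the linear IHS transform above."""
    r = i - v1 / np.sqrt(6) + v2 / np.sqrt(2)
    g = i - v1 / np.sqrt(6) - v2 / np.sqrt(2)
    b = i + 2 * v1 / np.sqrt(6)
    return np.stack([r, g, b], axis=-1)

def ihs_substitution_fuse(optical_rgb, sar):
    """Classical IHS substitution fusion (illustrative): replace the intensity
    channel with the mean/std-matched SAR image, then invert the transform.
    Hue and saturation information (v1, v2) is carried over unchanged."""
    i, v1, v2 = rgb_to_ihs(optical_rgb)
    # crude moment matching so the SAR band has the same mean/std as I
    sar_matched = (sar - sar.mean()) / (sar.std() + 1e-12) * i.std() + i.mean()
    return ihs_to_rgb(sar_matched, v1, v2)
```

Because all spatial detail in the fused result then comes from the SAR band alone, a gradient-based term such as GTF's can be layered on top to also retain detail from the optical image, which is the combination IHS-GTF pursues.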