The main objective of this article is the quality assessment of pansharpening fusion methods. Pansharpening is a fusion technique that combines a panchromatic image of high spatial resolution with multispectral image data of lower spatial resolution to obtain a high-resolution multispectral image. During this process, the significant spectral characteristics of the multispectral data should be preserved. For images acquired at the same time by the same sensor, most pansharpening algorithms provide very good results, i.e. they retain the high spatial resolution of the panchromatic image and the spectral information of the multispectral image (single-sensor, single-date fusion). For multi-date, multi-sensor fusion, however, these techniques can still create spatially enhanced datasets, but usually at the expense of spectral consistency. In this study, eight different image fusion methods are compared to assess their ability to fuse multitemporal and multi-sensor image data. A series of eight multitemporal multispectral remote sensing images is fused with a panchromatic Ikonos image and with a TerraSAR-X radar image as a panchromatic substitute. The fused images are analysed visually and quantitatively for spectral preservation and spatial improvement. Not only is the Ehlers fusion shown to be superior to all other tested algorithms, it is also the only method that guarantees excellent colour preservation for all dates and sensors used in this study.
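A common way to quantify the spectral preservation mentioned above is the per-band correlation between the original multispectral image and the fused image resampled back to the multispectral resolution. The following is a minimal sketch of that idea, not the paper's exact evaluation protocol; the array shapes and band ordering are assumptions.

```python
# Sketch of a per-band spectral-preservation check (assumed band-first arrays,
# fused image already resampled to the original multispectral resolution).
import numpy as np

def band_correlation(ms_band: np.ndarray, fused_band: np.ndarray) -> float:
    """Pearson correlation between an original and a fused band of equal shape."""
    a = ms_band.astype(np.float64).ravel()
    b = fused_band.astype(np.float64).ravel()
    a -= a.mean()
    b -= b.mean()
    return float((a @ b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def spectral_preservation(ms: np.ndarray, fused_lowres: np.ndarray) -> list[float]:
    """One correlation value per band; values near 1 indicate good preservation."""
    return [band_correlation(ms[i], fused_lowres[i]) for i in range(ms.shape[0])]
```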
The transition from film imaging to digital imaging in photogrammetric data capture is opening interesting possibilities for photogrammetric processes. A great advantage of digital sensors is their radiometric potential. This article presents a state-of-the-art review on the radiometric aspects of digital photogrammetric images. The analysis is based on a literature review and a questionnaire submitted to various interest groups related to the photogrammetric process. An important contribution to this paper is a characterization of the photogrammetric image acquisition and image product generation systems. The questionnaire revealed many weaknesses in current processes, but the future prospects of radiometrically quantitative photogrammetry are promising.
Image fusion is a technique that is used to combine the spatial structure of a high-resolution panchromatic image with the spectral information of a low-resolution multispectral image to produce a high-resolution multispectral image. Currently, image fusion techniques via color or statistical transforms such as the Intensity-Hue-Saturation (IHS) and principal component (PC) methods are still widely used. These methods create multispectral images of higher spatial resolution but usually at the cost of color distortions in the fused images. This is especially true if the wavelength range of the panchromatic image does not correspond to that of the employed multispectral bands or for multitemporal/multisensor fusion. To overcome the color distortion problem, a number of new fusion methods have been developed in recent years. One of these is the Ehlers fusion algorithm, which is based on an IHS transform coupled with adaptive filtering in the Fourier domain. This method preserves the spectral characteristics of the lower spatial resolution multispectral images for single-sensor, multi-sensor, and multi-temporal fusion. A comparison between this method and three sophisticated new fusion techniques that are available in commercial image processing software is presented in this paper using multitemporal multi-sensor fusion with SPOT multispectral and Ikonos panchromatic datasets as well as single-sensor single-date multispectral and panchromatic Quickbird data. The fused images are compared visually and with statistical methods that are objective, reproducible, and quantitative. It can be shown that the sophisticated methods such as Gram-Schmidt fusion, CN spectral sharpening, and the modified IHS provide good results in color preservation for single-sensor fusion. For multi-temporal multi-sensor fusion, however, these methods produce significant changes in spectral characteristics for the fused datasets. This is not the case for the Ehlers fusion algorithm, which shows no recognizable color distortion even for multi-temporal and multi-sensor datasets.
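For context, the baseline IHS substitution scheme discussed above can be sketched in a few lines. This is the generic (fast) IHS injection idea, not the Ehlers fusion itself, which additionally filters the intensity component adaptively in the Fourier domain; the inputs are assumed to be co-registered float arrays with the multispectral bands already resampled to the panchromatic grid.

```python
# Minimal sketch of fast IHS substitution pansharpening (assumed inputs, see above).
import numpy as np

def ihs_pansharpen(r: np.ndarray, g: np.ndarray, b: np.ndarray,
                   pan: np.ndarray) -> tuple[np.ndarray, np.ndarray, np.ndarray]:
    # Intensity as the band mean (generalized IHS).
    intensity = (r + g + b) / 3.0
    # Match the panchromatic band to the intensity statistics to reduce bias.
    pan_matched = (pan - pan.mean()) * (intensity.std() / (pan.std() + 1e-12)) + intensity.mean()
    # Inject the high-frequency detail into every band; the mismatch between the
    # PAN wavelength range and the bands is what causes the color distortions noted above.
    detail = pan_matched - intensity
    return r + detail, g + detail, b + detail
```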
This paper describes the results of a new combined method that consists of a cooperative approach of several different algorithms for automated change detection. These methods are based on isotropic frequency filtering, spectral and texture analysis, and segmentation. For the frequency analysis, different band pass filters are applied to identify the relevant frequency information for change detection. After transforming the multitemporal images using a fast Fourier transform and applying the most suitable band pass filter to extract changed structures, we apply an edge detection algorithm in the spatial domain. For the texture analysis, we calculate the texture parameters energy and homogeneity for the multitemporal datasets. A principal component analysis is then applied to the resulting multispectral texture images, and the components are subtracted to obtain the texture change information. This method can be combined with spectral information and prior segmentation of the image data as well as with morphological operations for a final binary change result. A rule-based combination of the change algorithms is applied to calculate the probability of change for a particular location. This Combined Edge Segment Texture (CEST) method was tested with high-resolution remote-sensing images of the crisis area in Darfur (Sudan). Our results were compared with several standard algorithms for automated change detection, such as image difference, image ratio, principal component analysis, multivariate alteration detection (MAD) and post-classification change detection. CEST showed superior accuracy compared to the standard methods.
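The isotropic frequency filtering step described above can be illustrated with a short sketch: each acquisition is band-pass filtered in the Fourier domain and the filtered results are differenced to highlight changed structures. This is only an illustrative sketch under assumed cut-off radii; it is not the parameterization or the full CEST rule base used in the paper.

```python
# Sketch of Fourier-domain band-pass filtering for change extraction
# (cut-off radii are illustrative assumptions, in cycles per pixel).
import numpy as np

def bandpass_filter(image: np.ndarray, r_low: float, r_high: float) -> np.ndarray:
    """Keep only spatial frequencies whose radial frequency lies in [r_low, r_high]."""
    rows, cols = image.shape
    fy = np.fft.fftfreq(rows)[:, None]
    fx = np.fft.fftfreq(cols)[None, :]
    radius = np.sqrt(fx**2 + fy**2)
    mask = (radius >= r_low) & (radius <= r_high)
    spectrum = np.fft.fft2(image)
    return np.real(np.fft.ifft2(spectrum * mask))

def frequency_change(img_t1: np.ndarray, img_t2: np.ndarray,
                     r_low: float = 0.05, r_high: float = 0.25) -> np.ndarray:
    """Absolute difference of the band-pass filtered acquisitions."""
    return np.abs(bandpass_filter(img_t2, r_low, r_high) -
                  bandpass_filter(img_t1, r_low, r_high))
```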