“…Many studies report the problems and limitations associated with different fusion techniques (Chavez et al., 1991; Wald and Ranchin, 1997; Zhang, 2002). The most frequently encountered problem in fusion algorithms is that the fused image exhibits a notable deviation in visual appearance and spectral values from the original MS image (Ling et al., 2007; Kalpoma and Kudoh, 2007). Spectral distortions, including spatial artifacts, affect both manual and automated classifications because any error in the synthesis of the spectral signatures at the highest spatial resolution incurs an error in the decision (Ranchin et al., 2003).…”
Section: Image Fusion and Quality Assessment
Abstract: Remote sensing is a rapidly developing tool for mapping the abundance and distribution of Antarctic wildlife. While both panchromatic and multispectral imagery have been used in this context, image fusion techniques have received little attention. We tasked seven widely used fusion algorithms: Ehlers fusion, hyperspherical color space fusion, high-pass fusion, principal component analysis (PCA) fusion, Gram-Schmidt fusion, University of New Brunswick fusion, and wavelet-PCA fusion, with resolution-enhancing a series of single-date QuickBird-2 and WorldView-2 image scenes containing penguin guano, seals, and vegetation. Fused images were assessed for spectral and spatial fidelity using a variety of quantitative quality indicators and visual inspection methods. Our visual evaluation selected the high-pass fusion algorithm and the University of New Brunswick fusion algorithm as best for manual wildlife detection, while the quantitative assessment suggested the Gram-Schmidt fusion algorithm and the University of New Brunswick fusion algorithm as best for automated classification. The hyperspherical color space fusion algorithm exhibited mediocre results in terms of spectral and spatial fidelity. The PCA fusion algorithm showed spatial superiority at the expense of spectral inconsistencies. The Ehlers fusion algorithm and the wavelet-PCA algorithm showed the weakest performances. As remote sensing becomes a more routine method of surveying Antarctic wildlife, these benchmarks will provide guidance for image fusion and pave the way for more standardized products for specific types of wildlife surveys.
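One family of quantitative spectral-fidelity indicators of the kind this abstract mentions is ERGAS, computed per band between the original MS image and the fused image degraded back to the MS resolution. The sketch below is a minimal NumPy illustration under assumed band-stacked arrays; it is not the study's exact assessment protocol.

```python
import numpy as np

def ergas(ms, fused_degraded, ratio=4):
    """ERGAS spectral-quality index: lower is better, 0 means identical.

    ms, fused_degraded: arrays of shape (bands, H, W) with positive
    reflectance values (band means must be nonzero).
    ratio: spatial resolution ratio between Pan and MS (e.g., 4 for
    QuickBird-2: 0.6 m Pan vs. 2.4 m MS).
    """
    bands = ms.shape[0]
    acc = 0.0
    for b in range(bands):
        # Per-band RMSE between original MS and degraded fused image,
        # normalized by the band mean to make bands comparable.
        rmse = np.sqrt(np.mean((ms[b] - fused_degraded[b]) ** 2))
        acc += (rmse / ms[b].mean()) ** 2
    return 100.0 / ratio * np.sqrt(acc / bands)
```

A perfectly spectrally consistent fusion yields ERGAS of 0 once the fused product is degraded back to the MS grid; values grow with spectral distortion.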
“…One common assumption in some model-based and CS methods is that the Pan image X is a linear combination of the HR MS images Z [54,58–66], as in Equation (5). Equation (16) gives the solution for these methods.…”
Section: CS Methods from a Bayesian Perspective
Abstract: Component substitution (CS) and multi-resolution analysis (MRA) are the two basic categories in the extended general image fusion (EGIF) framework for fusing panchromatic (Pan) and multispectral (MS) images. Despite the diversity of methods, some questions about fusion remain unaddressed or have drawn contradictory conclusions. For example, is the spatial enhancement of CS methods better than that of MRA methods? Are spatial enhancement and spectral preservation in competition? How can the spectral consistency defined by Wald et al. in 1997 be achieved? By their definition, any synthetic image, once degraded to its original resolution, should be as identical as possible to the original image. To answer these questions, this research first shows that all CS and MRA methods can be derived from the Bayesian fusion method by adjusting a weight parameter that balances the contributions of the spatial injection and spectral preservation models. The spectral preservation model assumes a Gaussian distribution of the desired high-resolution MS images, with the up-sampled low-resolution MS images as the mean. The spatial injection model assumes a linear correlation between Pan and MS images. Thus the spatial enhancement depends on the weight parameter and is independent of the category (i.e., MRA or CS) to which a method belongs. This paper then adds a spectral consistency model to the Bayesian fusion framework to guarantee Wald's spectral consistency with regard to an arbitrary sensor point spread function. Although spectral preservation in the EGIF methods competes with spatial enhancement, Wald's spectral consistency property is complementary to spatial enhancement. We conducted experiments on satellite images acquired by the QuickBird and WorldView-2 satellites to confirm our analysis, and found that the performance of the traditional EGIF methods improved significantly after adding the spectral consistency model. (Remote Sens. 2015, 7, 6829)
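The linear Pan-MS assumption quoted above, that Pan is a weighted sum of the high-resolution MS bands, can be sketched as an ordinary least-squares fit. The function name and the use of up-sampled MS bands as a proxy for the unknown HR bands are illustrative assumptions, not the paper's actual Equation (16).

```python
import numpy as np

def estimate_pan_weights(pan, ms_up):
    """Estimate weights w such that pan ≈ sum_k w[k] * ms_up[k].

    pan: array of shape (H, W) on the Pan grid.
    ms_up: array of shape (bands, H, W), MS bands up-sampled to the Pan grid
    (a common stand-in for the unknown HR MS bands Z).
    Returns one weight per band, fitted by ordinary least squares.
    """
    A = ms_up.reshape(ms_up.shape[0], -1).T   # (pixels, bands) design matrix
    b = pan.ravel()                           # observed Pan intensities
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    return w
```

With such weights, a CS-style method synthesizes a low-resolution intensity image from the MS bands and injects the residual detail (Pan minus synthetic intensity) into each band; the Bayesian view in this paper recovers that scheme as one setting of its weight parameter.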
“…Information from multiple images covering the same scene can be fused at the feature level, but calibrating the intensities of images acquired at different time points is difficult [19], [20], as different noise processes overlap [21]. Feature-level fusion is also computationally expensive [22], [23] and ideally requires some knowledge about the sensor [20]. Thus, a typical approach is to extract the information of interest in a first step, for example following a pattern classification approach, and then to fuse the information across observations in a second step, for example by averaging the probabilistic maps or by assigning the vote of the majority of the observations [24], [25], [26], [27], [28].…”
Abstract: We evaluate and further develop a multitemporal fusion strategy that we use to detect the locations of ancient settlement sites in the Near East and to map their distribution, a spatial pattern that remains static over time. For each ASTER image acquired in our survey area in north-eastern Syria, we use a pattern classification strategy to map locations with a multispectral signal similar to that of the (few) known archaeological sites nearby. We obtain maps indicating the presence of anthrosols (soils that formed at the locations of ancient settlements and that have a distinct spectral pattern under certain environmental conditions) and find that pooling the probability maps from all available time points reduces the variance of the spatial anthrosol pattern significantly. Removing biased classification maps, i.e., those that rank last when comparing the probability maps with the (limited) ground truth we have, reduces the overall prediction error even further, and we estimate optimal weights for each image using a non-negative least squares regression strategy. The ranking and pooling approach we propose in this study shows a significant improvement over the plain averaging of anthrosol probability maps that we used in an earlier attempt to map archaeological sites in a 20,000 km² area in northern Mesopotamia, and we expect it to work well in other surveying tasks that aim at mapping static surface patterns with limited ground truth in long series of multispectral images.
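The weighted-pooling step this abstract describes, estimating per-image weights against limited ground truth by non-negative least squares and then combining the probability maps, can be sketched as follows. This is a minimal illustration assuming flattened maps and SciPy's `nnls` solver; the normalization to a convex combination is an added assumption, not necessarily the authors' exact procedure.

```python
import numpy as np
from scipy.optimize import nnls

def pool_probability_maps(prob_maps, ground_truth):
    """Pool per-image probability maps with NNLS-fitted weights.

    prob_maps: array of shape (n_images, n_pixels), one anthrosol
    probability map per acquisition, flattened.
    ground_truth: array of shape (n_pixels,), labels in {0, 1} for the
    (limited) pixels with known site / non-site status.
    """
    # Non-negative weights minimizing || prob_maps.T @ w - ground_truth ||;
    # biased maps naturally receive weights near zero.
    w, _ = nnls(prob_maps.T, ground_truth)
    if w.sum() > 0:
        w = w / w.sum()          # normalize to a convex combination
    return w @ prob_maps         # pooled probability map
```

Compared with plain averaging, the NNLS fit down-weights acquisitions whose maps disagree with the ground truth, which is the effect the ranking-and-pooling strategy exploits.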