2013
DOI: 10.1109/jstars.2013.2245860

Multi-Modal and Multi-Temporal Data Fusion: Outcome of the 2012 GRSS Data Fusion Contest

Abstract: The 2012 Data Fusion Contest organized by the Data Fusion Technical Committee (DFTC) of the IEEE Geoscience and Remote Sensing Society (GRSS) aimed at investigating the potential use of very high spatial resolution (VHR) multi-modal/multi-temporal image fusion. Three different types of data sets, including spaceborne multi-spectral, spaceborne synthetic aperture radar (SAR), and airborne light detection and ranging (LiDAR) data collected over the downtown San Francisco area were distributed during the Contest.…

Cited by 82 publications (35 citation statements). References 91 publications. Citing publications span 2014–2023. Selected citation statements are quoted below.
“…Nonetheless, the resolution of the imagery and the heterogeneity characteristic of urban landscapes make it difficult to automatically map detailed urban lands solely using optical remote sensing methods (Cockx, Voorde, & Canters, 2014). The use of ancillary datasets such as census data, road networks, impervious surface coverages, landscape metrics, land parcel attributes, and radar data was recently documented to improve urban classifications (Abed & Kaysi, 2003; Berger et al., 2013; Chaudhry & Mackaness, 2008; Hermosilla, Palomar-Vázquez, Balaguer-Beser, Balsa-Barreiro, & Ruiz, 2014; Schneider, Friedl, & Potere, 2014; Soergel, 2010; Wu, Qiu, Usery, & Wang, 2009). Fractal methods have also been documented as a successful component in aiding in locating UBs (Tannier & Thomas, 2013; Tannier, Thomas, Vuidel, & Frankhauser, 2011).…”
Section: Introduction (mentioning)
confidence: 99%
“…Evaluating the spatial and spectral details of images after applying an image fusion technique is an important step [51]. A guided-filtering process for image fusion is presented in [52]. A multi-modal and multi-temporal image fusion method is analyzed in [53]. An independent component analysis (ICA) approach can also be utilized to fuse a sequence of images [54].…”
Section: Image Fusion Systems (mentioning)
confidence: 99%
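The fusion techniques named in this statement ([52]-[54]) are cited but not described. For orientation only, the sketch below is a simplified, hypothetical guided-filter-weighted fusion of two co-registered grayscale images in NumPy/SciPy; it illustrates the general guided-filtering idea behind approaches like [52], not the specific method of that paper, and the function names and the saliency heuristic are assumptions of this sketch.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(guide, src, radius=8, eps=1e-3):
    """Edge-preserving smoothing of `src`, guided by `guide` (guided-filter scheme)."""
    size = 2 * radius + 1                     # square box-filter window
    mean_i  = uniform_filter(guide, size)
    mean_p  = uniform_filter(src, size)
    corr_i  = uniform_filter(guide * guide, size)
    corr_ip = uniform_filter(guide * src, size)

    var_i  = corr_i - mean_i * mean_i         # local variance of the guide
    cov_ip = corr_ip - mean_i * mean_p        # local covariance guide/source

    a = cov_ip / (var_i + eps)                # per-window linear coefficients
    b = mean_p - a * mean_i
    return uniform_filter(a, size) * guide + uniform_filter(b, size)

def fuse_two(img_a, img_b, radius=8, eps=1e-3):
    """Toy fusion of two co-registered grayscale float images via guided-filtered weights."""
    # Crude per-pixel saliency: deviation from a locally smoothed version.
    sal_a = np.abs(img_a - uniform_filter(img_a, 7))
    sal_b = np.abs(img_b - uniform_filter(img_b, 7))
    w = (sal_a >= sal_b).astype(float)        # binary "take img_a here" map
    w = guided_filter(img_a, w, radius, eps)  # refine weights along edges of img_a
    w = np.clip(w, 0.0, 1.0)
    return w * img_a + (1.0 - w) * img_b
```

The refinement step is the point of the exercise: a raw binary weight map follows pixel-wise saliency and would produce blocky seams, whereas smoothing it under the guidance image pushes weight transitions onto image edges before blending.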
“…In the Bayesian-based spectral change detection method, key dates and periods were missing from the record due to climatic conditions. Berger et al. [23] used SAR and airborne Light Detection and Ranging (LiDAR) data to improve the surface reflectance retrievals of the optical data. They also demonstrated the usefulness of fusing LiDAR with optical data.…”
Section: Related Work (mentioning)
confidence: 99%
“…The image fusion performance between the input images $A$, $B$ and the fused image $F$, computed over pixel grey levels, is given by

$$I_{FA}(f,a) = \sum_{f,a} p_{FA}(f,a)\,\log_2\!\frac{p_{FA}(f,a)}{p_{F}(f)\,p_{A}(a)}, \qquad (23)$$

$$I_{FB}(f,b) = \sum_{f,b} p_{FB}(f,b)\,\log_2\!\frac{p_{FB}(f,b)}{p_{F}(f)\,p_{B}(b)}, \qquad (24)$$

The MI between the input images $(A, B)$ and $F$ is given by…”
Section: Mutual Information (mentioning)
confidence: 99%
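Read literally, Eqs. (23) and (24) are the two mutual-information terms of the familiar MI fusion metric, estimated from joint grey-level histograms. The sketch below (plain NumPy; function names and the 256-bin histogram are assumptions of this sketch, not taken from the cited paper) computes each term and sums them, which is one common way the final MI score is formed.

```python
import numpy as np

def mutual_information(x, f, bins=256):
    """I(X; F) estimated from the joint grey-level histogram of two equally sized images."""
    joint, _, _ = np.histogram2d(x.ravel(), f.ravel(), bins=bins)
    p_xf = joint / joint.sum()                # joint pmf p_XF(x, f)
    p_x = p_xf.sum(axis=1, keepdims=True)     # marginal p_X(x), column vector
    p_f = p_xf.sum(axis=0, keepdims=True)     # marginal p_F(f), row vector
    nz = p_xf > 0                             # skip empty cells (0 * log 0 = 0)
    return float(np.sum(p_xf[nz] * np.log2(p_xf[nz] / (p_x @ p_f)[nz])))

def fusion_mi(img_a, img_b, fused, bins=256):
    """MI-based fusion quality: I(A; F) + I(B; F), cf. Eqs. (23) and (24)."""
    return (mutual_information(img_a, fused, bins)
            + mutual_information(img_b, fused, bins))
```

A higher score indicates that the fused image preserves more of the grey-level statistics of both sources; the metric is reference-free, which is why it appears in fusion evaluations where no ground-truth fused image exists.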