2018
DOI: 10.3390/ijgi7100401
Fusion of SAR and Multispectral Images Using Random Forest Regression for Change Detection

Abstract: In order to overcome the insufficiency of single remote sensing data in change detection, synthetic aperture radar (SAR) and optical image data can be used together for supplementation. However, conventional image fusion methods fail to address the differences in imaging mechanisms and cannot overcome some practical limitations such as usage in change detection or temporal requirement of the optical image. This study proposes a new method to fuse SAR and optical images, which is expected to be visually helpful…

Cited by 51 publications (26 citation statements)
References 38 publications (58 reference statements)
“…A single pixel does not contain enough information and, therefore, features other than the pixel value must be considered to perform the phenological normalization [28,38,39]. The spectral indices are considered as features in this study, which are influenced by phenological properties [39][40][41].…”
Section: Selection of the Spectral Index (mentioning; confidence: 99%)
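The snippet above argues that raw pixel values alone are insufficient features and that spectral indices should supplement them. A minimal sketch of that idea, assuming three illustrative bands (red, NIR, green) and two common indices (NDVI, GNDVI) rather than the citing paper's exact feature set:

```python
import numpy as np

def spectral_index_features(red, nir, green):
    """Stack raw band values with spectral indices (NDVI, GNDVI) as per-pixel
    features. Band and index choices are illustrative, not the paper's own."""
    eps = 1e-6  # guard against division by zero
    ndvi = (nir - red) / (nir + red + eps)
    gndvi = (nir - green) / (nir + green + eps)
    # shape: (H*W, 5) -> raw bands plus the two indices per pixel
    return np.stack([red, nir, green, ndvi, gndvi], axis=-1).reshape(-1, 5)

# toy 2x2 reflectance bands
red = np.array([[0.2, 0.3], [0.4, 0.1]])
nir = np.array([[0.6, 0.5], [0.8, 0.2]])
green = np.array([[0.3, 0.2], [0.5, 0.1]])
feats = spectral_index_features(red, nir, green)
print(feats.shape)  # (4, 5)
```

Each pixel row now carries phenology-sensitive indices alongside its raw values, which is the kind of enriched feature vector the snippet calls for.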
See 1 more Smart Citation
“…A single pixel does not contain enough information and, therefore, features other than the pixel value must be considered to perform the phenological normalization [28,38,39]. The spectral indices are considered as features in this study, which is influenced by phenological properties [39][40][41].…”
Section: Selection Of the Spectral Indexmentioning
confidence: 99%
“…Changes that are caused by vegetation in particular have the most typical characteristics, including nonlinearity, which induces serious disturbances when change detection is performed [25,26]. In addition, since optical satellite imagery, which is the main source in change detection, is affected by clouds and atmospheric conditions, it is difficult to acquire images that meet the temporal requirements [27,28]. In such cases, images with phenological differences as well as radiometric differences should be utilized [25,26].…”
(mentioning; confidence: 99%)
“…• Random Forest Regression [29]: applied individually to an RGB multispectral image and to a SAR image pre-processed by gray-level co-occurrence matrix descriptors (e.g., energy, homogeneity, angular second moment), to train independent classes by K-means; training samples are then selected and used in the learning process, determining the RF for each class, and the resulting RF is applied to the entire image. This results in the fusion of the surface roughness characteristics of the SAR image and the spectral characteristics of the MS image; • Two-Branch Convolutional Neural Network [30]: CNN layers are applied to the respective input features: a principal component analysis (PCA) is performed on a hyperspectral 144-band image (branch 1; image provided at the IEEE-GRSS 2013 data fusion contest), then, in subsequent layers, a series of filters whose parameters are tuned by supervised learning.…”
Section: What Does This Special Issue Bring to the Reader? (mentioning; confidence: 99%)
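The class-wise RF regression pipeline described above (K-means classes on the MS image, one regressor per class, applied back to the whole image) can be sketched as follows. This is a simplified illustration, not the paper's implementation: random vectors stand in for the GLCM texture descriptors, and each class RF is trained on all of its pixels rather than a selected sample:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Toy data: one SAR-derived texture feature and a 3-band MS image, per pixel.
n_pix = 500
sar = rng.random((n_pix, 1))  # stand-in for GLCM descriptors of the SAR image
ms = np.hstack([sar * w + rng.normal(0, 0.05, (n_pix, 1)) for w in (0.8, 0.5, 0.3)])

# 1) K-means on the MS image defines independent classes.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(ms)

# 2) One RF regressor per class maps SAR texture features to MS spectra.
fused = np.empty_like(ms)
for c in np.unique(labels):
    idx = labels == c
    rf = RandomForestRegressor(n_estimators=50, random_state=0)
    rf.fit(sar[idx], ms[idx])          # in practice: selected training samples only
    fused[idx] = rf.predict(sar[idx])  # 3) apply the class RF to its pixels

print(fused.shape)  # (500, 3)
```

The fused output carries the SAR roughness signal (via the regressors' inputs) expressed in the spectral space of the MS image, which is the fusion outcome the snippet describes.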
“…Pohl and Van Genderen [43], Zhang [44], and Zhu et al [45] catalogued and reviewed various approaches for satellite image fusion such as intensity-hue-saturation [46], principal component analysis [47], wavelet decomposition [48], high-pass filter (HPF) [49], sparse representation [50] and area-to-point regression kriging (ATPRK) methods [51]. Recently, machine learning techniques such as deep learning [52] and random forest (RF) [53] have gained popularity in satellite image fusion. Considering its advantages and past performance [54], an RF algorithm is proposed here for the multi-sensor data alignment in the virtual constellation of L8, S2A, and ASTER.…”
Section: Introduction (mentioning; confidence: 99%)
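Of the classical fusion approaches catalogued above, the high-pass filter (HPF) method [49] is simple enough to sketch in a few lines: the high-frequency detail of the panchromatic band is injected into the upsampled multispectral bands. A minimal version, assuming a box low-pass filter and nearest-neighbour upsampling (real HPF schemes weight the injected detail per band):

```python
import numpy as np
from scipy.ndimage import uniform_filter, zoom

def hpf_fusion(ms, pan, scale=2, kernel=5):
    """HPF pansharpening sketch: add PAN high-pass detail to upsampled MS bands.
    Box filter and order-0 upsampling are simplifications; injection weights omitted."""
    detail = pan - uniform_filter(pan, size=kernel)  # high-frequency component
    up = zoom(ms, (scale, scale, 1), order=0)        # upsample MS to the PAN grid
    return up + detail[..., None]                    # inject detail into each band

ms = np.random.rand(8, 8, 3)    # coarse multispectral image
pan = np.random.rand(16, 16)    # fine panchromatic band
fused = hpf_fusion(ms, pan)
print(fused.shape)  # (16, 16, 3)
```

The fused product keeps the MS spectral content at the PAN spatial resolution, which is the trade-off all the listed pansharpening methods negotiate in different ways.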