IGARSS 2019 - 2019 IEEE International Geoscience and Remote Sensing Symposium
DOI: 10.1109/igarss.2019.8898223

Fusing Multi-Seasonal Sentinel-2 Images with Residual Convolutional Neural Networks for Local Climate Zone-Derived Urban Land Cover Classification

Abstract: This paper proposes a framework to fuse multi-seasonal Sentinel-2 images, with application to LCZ-derived urban land cover classification. Cross-validation over a seven-city study area in central Europe demonstrates its consistently better performance than several previous approaches under the same experimental setup. Based on our previous work, we can conclude that decision-level fusion is better than feature-level fusion for similar tasks at a similar scale with multi-seasonal Sentinel-2 images. With the frame…
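The abstract's conclusion that decision-level fusion outperforms feature-level fusion can be illustrated with a minimal sketch of late fusion: per-season classifiers produce class-probability maps that are averaged before taking the arg-max, instead of concatenating per-season features before a single classifier. The function name, array shapes, the choice of 17 classes, and the random placeholder inputs below are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def decision_level_fusion(seasonal_probs):
    """Fuse per-season class-probability maps by averaging (late fusion).

    seasonal_probs: list of arrays of shape (H, W, n_classes), each holding
    the softmax output of a per-season classifier (e.g. a residual CNN).
    Returns an (H, W) map of fused class labels.
    """
    # Average class probabilities over seasons, then pick the arg-max class.
    fused = np.mean(np.stack(seasonal_probs, axis=0), axis=0)
    return np.argmax(fused, axis=-1)

# Hypothetical usage: four seasonal predictions over a 256 x 256 tile with
# 17 classes (random placeholders, only to show the expected shapes).
rng = np.random.default_rng(0)
seasonal_probs = [rng.dirichlet(np.ones(17), size=(256, 256)) for _ in range(4)]
lcz_map = decision_level_fusion(seasonal_probs)
print(lcz_map.shape)  # (256, 256)
```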

Cited by 6 publications (4 citation statements)
References 11 publications
“…Apart from the dataset to train the network, in the prediction phase, multisource multitemporal data fusion is also a straightforward and effective approach to further improve the obtained benchmark LCZ classification results. The effectiveness of data fusion has been shown in [18], [65], and [21]. Specifically, a final robust result can be achieved via a decision-level fusion of multiple predictions that are obtained from multisource data, such as SAR and hyperspectral images, with the same or different classifiers [66]-[70].…”
Section: Discussion (mentioning)
confidence: 99%
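The quoted discussion describes decision-level fusion of predictions obtained from different sources or classifiers. A minimal per-pixel majority-vote sketch in that spirit is given below; the function name, the tie-breaking rule, and the assumption that all inputs are integer label maps sharing one class encoding are illustrative choices, not the cited papers' implementations.

```python
import numpy as np

def majority_vote(label_maps, n_classes):
    """Decision-level fusion by per-pixel majority vote over class labels.

    label_maps: list of (H, W) integer label maps, one per source/classifier
    (e.g. an SAR-based and an optical-based model). Ties are broken by the
    lowest class index, an arbitrary choice for this sketch.
    """
    stacked = np.stack(label_maps, axis=0)  # (n_models, H, W)
    # Count votes per class at every pixel, then pick the most-voted class.
    votes = np.stack([(stacked == c).sum(axis=0) for c in range(n_classes)], axis=-1)
    return np.argmax(votes, axis=-1)  # (H, W)

# Hypothetical usage with three 2x2 label maps and 3 classes.
preds_sar = np.array([[0, 1], [2, 2]])
preds_opt = np.array([[0, 2], [2, 1]])
preds_cnn = np.array([[1, 2], [2, 2]])
print(majority_vote([preds_sar, preds_opt, preds_cnn], n_classes=3))
# [[0 2]
#  [2 2]]
```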
“…The key to efficient large-scale LCZ classification is developing advanced machine learning models with high generalization ability [14], [15]. In this regard, tailoring deep learning-based approaches to the peculiarities of remote sensing data is one important strategy that has gained much attention recently [16], [17], [18], [19], [20], [21], [22]. A review of these published studies tells us that deep learning, specifically in the form of convolutional neural networks (CNNs), is indeed able to enhance LCZ classification accuracy compared to random forest approaches [23], given a proper dataset, owing to its powerful feature representation capacity.…”
Section: Introduction (mentioning)
confidence: 99%
“…For the city of Cologne, these data are derived from satellite data (Sentinel-2). Detailed information on the framework used to identify the LCZs can be found in Qiu et al. [39]. The LCZ land-use classes, originally introduced by Stewart and Oke [13] to analyze the impact of built-up morphology on the climate of urban areas, describe the degree of built-up area within a defined area.…”
Section: Data (mentioning)
confidence: 99%