2018
DOI: 10.1109/jstars.2018.2805923
Remote Sensing Image Fusion With Deep Convolutional Neural Network

Cited by 286 publications (132 citation statements)
References 51 publications
“…Recently, a number of DL-based pansharpening methods were proposed by exploiting different network structures [15,22,42,43,29,30,32]. These methods can be easily adapted to MS/HS fusion problem.…”
Section: Deep Learning Based Methods (mentioning)
confidence: 99%
“…Because deep learning can automatically learn additional features from various types of data, it has gained considerable attention in recent years [17][18][19][20][21]. Different from the conventional pansharpening methods, deep learning-based methods present more ideal solutions for improving the performance of pansharpening.…”
Section: Introduction (mentioning)
confidence: 99%
“…To obtain the high quality fused image, Wei et al [19] presented a deep residual network (ResNet) for pansharpening. Shao et al [20] developed a two-branch network that can separately obtain salient features from MS and PAN images. A multiscale and multidepth convolutional neural network (CNN) for pansharpening was introduced by Yuan et al [12].…”
Section: Introduction (mentioning)
confidence: 99%
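The excerpt above describes a two-branch design in which salient features are extracted separately from the MS and PAN inputs and then combined. The following is a minimal numpy sketch of that idea, not the authors' actual network: the kernels, sizes, and fusion weights are illustrative assumptions, with a single random filter standing in for each trained branch and a fixed weighted sum standing in for the learned fusion layer.

```python
import numpy as np

def conv2d(x, k):
    """Valid-mode 2-D convolution of a single-channel image with kernel k."""
    kh, kw = k.shape
    h, w = x.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def relu(x):
    return np.maximum(x, 0.0)

rng = np.random.default_rng(0)

# Toy inputs: a 16x16 PAN patch and one upsampled MS band of the same size.
pan = rng.random((16, 16))
ms = rng.random((16, 16))

# Branch 1 extracts spatial features from PAN; branch 2 extracts spectral
# features from MS. One 3x3 filter per branch stands in for a trained stack.
k_pan = rng.standard_normal((3, 3)) * 0.1
k_ms = rng.standard_normal((3, 3)) * 0.1
f_pan = relu(conv2d(pan, k_pan))  # 14x14 feature map
f_ms = relu(conv2d(ms, k_ms))     # 14x14 feature map

# Fusion stage: a fixed weighted combination of the branch outputs, standing
# in for the concatenation + convolution used in the two-branch design.
w = np.array([0.6, 0.4])
fused = w[0] * f_pan + w[1] * f_ms
print(fused.shape)  # (14, 14)
```

In the actual method the branch filters and the fusion layer are all learned end-to-end; this sketch only makes the data flow of "separate extraction, then fusion" concrete.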
“…One of the key benefits of DL is shaping hierarchical representations by extracting abstract, that is, high-level, features from low-level features [19]. Finally, the fused image is obtained by utilizing both the decision map and the source images [20]. The inverse transform is performed on the fused sub-bands to yield the fusion result at the last stage [21].…”
Section: Introduction (mentioning)
confidence: 99%
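The last excerpt mentions fusing source images through a decision map. A minimal numpy sketch of that scheme, under assumed toy inputs: the activity measure (local gradient magnitude) and the per-pixel hard selection rule are illustrative stand-ins for whatever saliency measure and combination rule reference [20] actually uses.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two toy 8x8 source images of the same scene to be fused.
src_a = rng.random((8, 8))
src_b = rng.random((8, 8))

def activity(img):
    """Crude per-pixel saliency: local gradient magnitude."""
    gy, gx = np.gradient(img)
    return np.hypot(gx, gy)

# Decision map: 1 where source A is more salient, 0 where B wins.
decision = (activity(src_a) >= activity(src_b)).astype(float)

# Fused image: each pixel is taken from the source the decision map selects.
fused = decision * src_a + (1.0 - decision) * src_b
```

Because the decision map is binary, every fused pixel comes verbatim from one of the two sources; soft (weighted) decision maps replace the 0/1 values with continuous weights.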