2019 IEEE International Conference on Image Processing (ICIP)
DOI: 10.1109/icip.2019.8803272
Improving Super Resolution Methods Via Incremental Residual Learning

Abstract: Recently, Convolutional Neural Networks (CNNs) have shown promising performance in super-resolution (SR). However, these methods operate primarily on Low Resolution (LR) inputs for memory efficiency, which limits, as we demonstrate, their ability to (i) model high-frequency information and (ii) smoothly translate from LR to High Resolution (HR) space. To this end, we propose a novel Incremental Residual Learning (IRL) framework to address these issues. In IRL, we first select a typical SR pre-trai…
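The abstract is truncated, but the core idea of incremental residual learning can be illustrated with a minimal sketch: a pre-trained base SR model produces a coarse HR estimate, and a separate residual branch incrementally adds a high-frequency correction on top of it. The function names (`base_sr`, `residual_branch`) and the specific operators below are illustrative stand-ins, not the paper's architecture; a real residual branch would be a small CNN trained on the base model's errors.

```python
import numpy as np

def base_sr(lr, scale=2):
    """Stand-in for a pre-trained SR model: nearest-neighbor upsampling."""
    return np.kron(lr, np.ones((scale, scale)))

def residual_branch(hr_estimate):
    """Stand-in residual predictor: a scaled Laplacian-like high-pass
    response, mimicking the high-frequency detail a learned branch
    would add on top of the base output."""
    padded = np.pad(hr_estimate, 1, mode="edge")
    lap = (padded[:-2, 1:-1] + padded[2:, 1:-1]
           + padded[1:-1, :-2] + padded[1:-1, 2:]
           - 4.0 * hr_estimate)
    return 0.1 * lap

def irl_forward(lr, scale=2):
    hr = base_sr(lr, scale)           # coarse HR estimate from base model
    return hr + residual_branch(hr)   # incrementally add residual detail

lr = np.arange(16.0).reshape(4, 4)
sr = irl_forward(lr)
print(sr.shape)  # (8, 8)
```

The design point is that the base model is kept fixed and only the residual correction is learned, so each stage refines rather than replaces the previous estimate.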

Cited by 1 publication (2 citation statements)
References 21 publications (47 reference statements)
“…In order to generate more photo-realistic results, Ledig et al further developed a GAN-based network, the SRGAN, which is a combination of SRResNet, the VGG-based perceptual loss, and a discriminator. Based on the success of SRResNet and SRGAN, many variants have been developed so far, including EDSR [20], RCAN [21], ASRResNet [8], IRL [22], and WDSRGAN [23]. EDSR eliminated the batch normalization layers and expanded the model size of SRResNet.…”
Section: NN-based SR
confidence: 99%
“…Among the NN-based methods, references [8, 14–18, 20–22] try to solve Eq. (1), and no image prior is utilized.…”
Section: NN-based SR
confidence: 99%