2018
DOI: 10.1007/978-3-030-01249-6_16
Fast, Accurate, and Lightweight Super-Resolution with Cascading Residual Network

Cited by 891 publications (854 citation statements)
References 31 publications
“…For lightweight networks, Hui et al. [11] developed the information distillation network, which better exploits hierarchical features by processing the current feature maps in separate streams. Ahn et al. [2] designed an architecture that implements a cascading mechanism on a residual network to boost performance.…”
Section: Single Image Super-Resolution
confidence: 99%
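The cascading mechanism mentioned in the excerpt above can be sketched roughly as follows: after each block, the features produced so far are concatenated and fused back by a 1x1 convolution. This is a minimal numpy illustration only, not CARN's actual layers; the block structure, channel counts, and random weights are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def block(x):
    # Stand-in for a residual block: a fixed random per-channel mix plus
    # the identity shortcut (illustrative only, not CARN's trained weights).
    C = x.shape[0]
    W = rng.standard_normal((C, C)) * 0.1
    return x + np.einsum('oc,chw->ohw', W, x)

def one_by_one_conv(x, out_ch):
    # A 1x1 convolution is a per-pixel linear map over channels.
    W = rng.standard_normal((out_ch, x.shape[0])) * 0.1
    return np.einsum('oc,chw->ohw', W, x)

def cascade(x, num_blocks=3):
    # Cascading mechanism (sketch): after each block, concatenate all
    # feature maps produced so far and fuse them with a 1x1 conv, so
    # every later stage sees the multi-level features directly.
    feats = [x]
    out = x
    for _ in range(num_blocks):
        out = block(out)
        feats.append(out)
        out = one_by_one_conv(np.concatenate(feats, axis=0), x.shape[0])
    return out

x = rng.standard_normal((8, 16, 16))  # (channels, H, W) feature map
y = cascade(x)
print(y.shape)  # (8, 16, 16)
```

The fusion step keeps the channel count constant, so the cascading shortcuts add information flow without growing the feature maps stage over stage.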
“…We evaluate the quality of the super-resolved images using two metrics: peak signal-to-noise ratio (PSNR) and the structural similarity index (SSIM) [30]. Following existing works [2,11,12,18,24,36,38], we compute both values on the luminance channel (i.e., the Y channel of the YCbCr representation converted from RGB). Additionally, for the arbitrary/unknown scale-factor experiments, we use the RealSR dataset from the NTIRE 2019 Real Super-Resolution Challenge.…”
Section: Experiments, 4.1 Datasets and Metrics
confidence: 99%
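The luminance-channel evaluation described in this excerpt can be sketched in a few lines: convert RGB to the BT.601 Y channel, then compute PSNR on that channel only. The helper names and the toy images below are assumptions for illustration, not the benchmark scripts used by the cited works.

```python
import numpy as np

def rgb_to_y(img):
    # ITU-R BT.601 luma conversion commonly used in SR benchmarks
    # (img: float array in [0, 255], shape (H, W, 3)).
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    return 16.0 + (65.738 * r + 129.057 * g + 25.064 * b) / 256.0

def psnr(ref, test, peak=255.0):
    # PSNR = 10 * log10(peak^2 / MSE); identical images give infinity.
    mse = np.mean((ref.astype(np.float64) - test.astype(np.float64)) ** 2)
    return float('inf') if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

ref = np.full((4, 4, 3), 120.0)   # toy ground-truth image
noisy = ref + 1.0                 # off by one intensity level everywhere
print(psnr(rgb_to_y(ref), rgb_to_y(noisy)))
```

Computing the metric on Y rather than on all three RGB channels matters because human perception is dominated by luminance, and it is what makes reported numbers comparable across the cited papers.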
“…A number of neural-network-based SR models have been proposed. By model design, they can be categorized into linear models such as SRCNN [9] and VDSR [10]; residual models such as CARN [11] and REDNet [12]; recursive models such as DRCN [13] and DRRN [14]; densely connected models such as RDN [15] and D-DBPN [16]; attention-based models such as SelNet [17] and RCAN [18]; progressive models such as SCN [19] and LapSRN [20]; and generative adversarial network (GAN) models such as EnhanceNet [21] and SRGAN [22].…”
Section: Introduction
confidence: 99%