2021
DOI: 10.1016/j.jvcir.2021.103300
Multi-scale attention network for image super-resolution

Cited by 16 publications (16 citation statements)
References 28 publications
“…On the whole, MAFDN achieves the best performance on all five datasets. Specifically, compared with the recently proposed FDIWN [38], DDistill-SR [39], PILN [3], DRSAN [40], ESRT [67], FDSCSR [64], LESR [62], AFAN [61], and FRN [65], our MAFDN obtains better performance with comparable or fewer parameters. For example, the proposed MAFDN achieves 0.15 dB/0.0005, 0.17 dB/0.0019, and 0.21 dB/0.0037 PSNR/SSIM gains over the recent attention-based model AFAN [61] (682K/681K/692K) for ×2, ×3, and ×4 SR on Set5, respectively.…”
Section: Lightweight SISR Methods for Comparison
confidence: 84%
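The PSNR gains quoted above are easier to interpret with the metric's definition in hand: PSNR is logarithmic in the mean squared error, so a 0.15 dB gain corresponds to roughly a 3.4% reduction in MSE. A minimal sketch of the standard PSNR computation (the function and test values here are illustrative, not taken from the paper):

```python
import numpy as np

def psnr(ref, img, max_val=255.0):
    """Peak signal-to-noise ratio in dB between two images."""
    mse = np.mean((ref.astype(np.float64) - img.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)

# Constant error of 4 gray levels -> MSE = 16
ref = np.full((8, 8), 128.0)
noisy = ref + 4.0
print(round(psnr(ref, noisy), 2))  # 36.09
```

Because the scale is logarithmic, a fixed dB gain represents the same relative MSE improvement regardless of the baseline value, which is why sub-decibel differences are considered meaningful on saturated benchmarks like Set5.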
“…To demonstrate the capacity and superiority of MAFDN, representative lightweight SISR methods and recently proposed state-of-the-art models are compared, i.e., SRCNN [19], PAN [74], RFDN [58], FDIWN [38], DDistill-SR [39], PILN [3], JSNet [70], DiVANet [59], EMASRN [60], DRSAN [40], ESRT [67], MMSR [68], HPUN [66], FDSCSR [64], LESR [62], IRN [63], AFAN [61], and FRN [65]. Overall, these competitors cover attention-based, feature distillation-based, and NAS-based SISR models.…”
Section: Lightweight SISR Methods for Comparison
confidence: 99%
“…Very recently, the attention mechanism has been widely applied in SR tasks [29, 31, 32]. Zhang et al. [33] pioneered the introduction of channel attention (CA) to image SR reconstruction and showed prominent restoration results. Li [34] introduced a novel spatial attention (SA) that learned a group of weights to emphasize high-frequency information.…”
Section: Introduction
confidence: 99%
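The channel attention mechanism credited to Zhang et al. above follows the squeeze-and-excitation pattern: globally pool each feature channel, pass the pooled vector through a small bottleneck, and use the resulting sigmoid weights to rescale the channels. A minimal numpy sketch, assuming random placeholder weights in place of learned ones:

```python
import numpy as np

def channel_attention(feat, reduction=4, rng=None):
    """Squeeze-and-excitation-style channel attention (illustrative).

    feat: array of shape (C, H, W). The weight matrices here are random
    placeholders; in a real network they are learned end to end.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    c = feat.shape[0]
    # Squeeze: global average pooling over spatial dimensions -> (C,)
    z = feat.mean(axis=(1, 2))
    # Excitation: bottleneck of two fully connected layers
    w1 = rng.standard_normal((c // reduction, c)) * 0.1
    w2 = rng.standard_normal((c, c // reduction)) * 0.1
    s = np.maximum(w1 @ z, 0.0)              # ReLU
    s = 1.0 / (1.0 + np.exp(-(w2 @ s)))      # sigmoid -> per-channel weights in (0, 1)
    # Rescale: reweight each channel of the input feature map
    return feat * s[:, None, None]

feat = np.ones((8, 4, 4))
out = channel_attention(feat)
print(out.shape)  # (8, 4, 4)
```

Spatial attention, as in Li's variant, transposes this idea: instead of one weight per channel, it learns one weight per spatial location to emphasize high-frequency regions.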