2020
DOI: 10.1002/cpe.6109

Image super‐resolution with parallel convolution attention network

Abstract: In recent years, deep convolutional neural networks (CNNs) have achieved many outstanding results in super‐resolution owing to their superior ability. However, the majority of CNNs only use a series of convolution kernels of the same size to extract features, which limits the receptive field. In this work, we propose a parallel convolution attention network (PCAN) to extract features in an effective way. Specifically, a pair of parallel convolutions (PCs) with different kernel sizes is used in one layer in…
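As a rough illustration of the parallel-convolution idea the abstract describes, the sketch below runs two convolutions with different kernel sizes side by side on the same feature map and fuses their outputs, so a single layer sees two receptive fields at once. This is not the authors' implementation: the framework (PyTorch), the kernel sizes (3 and 5), the channel count, and the 1x1 fusion step are assumptions made for illustration only, and the attention part of PCAN is omitted.

```python
# Minimal sketch (assumed PyTorch design, not the paper's exact architecture):
# two parallel convolutions with different kernel sizes extract multi-scale
# features from the same input, and a 1x1 convolution fuses them.
import torch
import torch.nn as nn

class ParallelConvBlock(nn.Module):
    def __init__(self, channels: int = 64):
        super().__init__()
        # Two parallel branches with different receptive fields (assumed 3x3 and 5x5).
        self.conv3 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.conv5 = nn.Conv2d(channels, channels, kernel_size=5, padding=2)
        # 1x1 convolution to fuse the concatenated multi-scale features.
        self.fuse = nn.Conv2d(2 * channels, channels, kernel_size=1)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        f3 = self.act(self.conv3(x))  # features from the smaller receptive field
        f5 = self.act(self.conv5(x))  # features from the larger receptive field
        return self.fuse(torch.cat([f3, f5], dim=1))  # multi-scale fusion

if __name__ == "__main__":
    block = ParallelConvBlock(channels=64)
    out = block(torch.randn(1, 64, 32, 32))
    print(out.shape)  # torch.Size([1, 64, 32, 32])
```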

Cited by 1 publication (1 citation statement)
References 42 publications
“…This will cause limited receptive fields. In the contribution by Zhang et al, “Image super‐resolution with parallel convolution attention network,” the authors propose a parallel convolution attention network (PCAN) to extract features in an effective way [9]. Specifically, a pair of parallel convolutions (PCs) with different kernel sizes is used in one layer in their network, which can extract features within different receptive fields, thereby making full use of the multiscale information.…”
Section: Themes of This Special Issue (citation type: mentioning)
confidence: 99%