2021
DOI: 10.48550/arxiv.2109.01664
Preprint
Exploring Separable Attention for Multi-Contrast MR Image Super-Resolution

Cited by 3 publications (5 citation statements)
References 26 publications
“…T²Net [13], and SANet [14]. The benchmark results are trained in the same way and on the same training datasets as described in their corresponding works.…”
Section: Results
“…Medical image super-resolution methods are classified into two broad categories: single-contrast super-resolution (SCSR) [13,21,22,28,30,41–43] and multi-contrast super-resolution (MCSR) [14,24,27,37,44]. Traditionally, because of their simplicity, bicubic and B-spline interpolations are two of the most frequently used SCSR methods in MRI practice.…”
Section: Medical Image Super-resolution
“…In the field of deep convolutional neural networks, the attention mechanism plays a crucial role in reconstructing high-quality MRI images. In this regard, Feng et al. [59] suggested the idea of separable attention for multi-contrast MRI image SR. The separable attention mechanism extracts contrast-specific anatomical information such as blood vessels, tissues, and bones. Chen et al. [60] proposed a deep-CNN-based SR model, the feedback adaptive weighted dense network (FWDN), to reconstruct a high-resolution medical image from a low-resolution medical input image.…”
Section: Related Work
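The core idea behind separable attention mentioned in the quote above — factoring spatial attention into independent per-axis weightings rather than one full 2-D attention map — can be illustrated with a toy NumPy sketch. This is not the authors' implementation from SANet; the function names and the pooling/softmax choices here are illustrative assumptions only.

```python
import numpy as np

def softmax(x, axis):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def separable_attention(feat):
    """Toy axis-wise attention over a (C, H, W) feature map.

    Instead of one joint (H*W, H*W) attention map, weights are computed
    separately along the height and width axes and then combined, which
    is the 'separable' factorization in spirit (illustrative only).
    """
    # Height weights: pool over width, normalize over rows -> (C, H, 1)
    h_weights = softmax(feat.mean(axis=2, keepdims=True), axis=1)
    # Width weights: pool over height, normalize over columns -> (C, 1, W)
    w_weights = softmax(feat.mean(axis=1, keepdims=True), axis=2)
    # Broadcasting combines the two 1-D weightings into a 2-D reweighting.
    return feat * h_weights * w_weights

feat = np.random.rand(4, 8, 8).astype(np.float32)
out = separable_attention(feat)
```

The factorization reduces the attention cost from O((HW)²) for full spatial self-attention to O(H² + W²)-style per-axis computations, which is the usual motivation for separable designs on large feature maps.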