2022
DOI: 10.48550/arxiv.2204.07204
Preprint
Test-Time Training Can Close the Natural Distribution Shift Performance Gap in Deep Learning Based Compressed Sensing

Abstract: Deep learning based image reconstruction methods outperform traditional methods in accuracy and runtime. However, neural networks suffer from a performance drop when applied to images from a different distribution than the training images. For example, a model trained for reconstructing knees in accelerated magnetic resonance imaging (MRI) does not reconstruct brains well, even though the same network trained on brains reconstructs brains perfectly well. Thus there is a distribution shift performance gap for a…
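The abstract is truncated above. As a rough illustration of the test-time-training idea named in the title, the sketch below fine-tunes a pretrained reconstruction network on a single test measurement by minimizing a measurement-consistency loss. The operator names `A` and `A_adj`, the squared-error loss, and all hyperparameters are illustrative assumptions, not the authors' exact method.

```python
import torch

def test_time_train(model, y, A, A_adj, steps=100, lr=1e-5):
    """Adapt a pretrained reconstruction network `model` to one test
    measurement `y` by minimizing || A(model(A_adj(y))) - y ||^2.

    `A` (forward operator, e.g. a masked Fourier transform in MRI) and
    `A_adj` (its adjoint) are assumed callables; this is a generic
    test-time-training sketch, not the paper's exact objective.
    """
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    x0 = A_adj(y)                 # coarse initial estimate (e.g. zero-filled)
    model.train()
    for _ in range(steps):
        optimizer.zero_grad()
        x_hat = model(x0)         # current network reconstruction
        # Penalize disagreement between the simulated and observed measurement
        loss = (A(x_hat) - y).abs().pow(2).mean()
        loss.backward()
        optimizer.step()
    model.eval()
    with torch.no_grad():
        return model(x0)          # reconstruction from the adapted network
```

Because the loss uses only the test measurement itself, no ground-truth image from the shifted distribution is needed; the network is adapted per example at inference time.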

Cited by 1 publication (1 citation statement) · References 10 publications
“…Darestani et al. [121] denoted this performance drop as a “distribution shift performance gap”. Furthermore, the authors proposed a domain adaptation method that successfully reduced the performance gap by 87%–99%.…”
Section: Pitfalls and Future Outlook
Confidence: 99%