2019
DOI: 10.1088/1361-6420/aaf14a

Deep null space learning for inverse problems: convergence analysis and rates

Abstract: Recently, deep learning based methods have appeared as a new paradigm for solving inverse problems. These methods empirically show excellent performance but lack theoretical justification; in particular, no results on their regularization properties are available. This is notably the case for two-step deep learning approaches, where a classical reconstruction method is applied to the data in a first step and a trained deep neural network is applied to improve the result in a second step. In this paper, we clos…


Cited by 82 publications (96 citation statements) · References 32 publications
“…More precisely, let A † : Y → X be an analytically known reconstruction operator that is proven to be robust. One can then train a convolutional neural network to remove reconstruction artefacts that arise from using A † [14], [15], [26]. These artefacts can be quite notable when data is highly noisy or under-sampled.…”
Section: A. Reconstruction and Post-processing
confidence: 99%
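The two-step scheme in the excerpt above can be sketched in a few lines of NumPy. This is a minimal illustration, not the cited method: a toy underdetermined matrix stands in for the forward operator, `np.linalg.pinv` plays the role of the analytically known reconstruction operator A†, and a hypothetical placeholder function stands in for the trained CNN.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 5))   # toy underdetermined forward operator
x_true = rng.standard_normal(5)
y = A @ x_true                    # noiseless measurements

# Step 1: apply the known, robust reconstruction operator A† (here the
# Moore-Penrose pseudoinverse); the result carries null-space artefacts.
A_pinv = np.linalg.pinv(A)
x_init = A_pinv @ y

def post_process(x):
    # Step 2: a trained network would remove artefacts here; this
    # identity placeholder only marks where it plugs in (hypothetical).
    return x

x_rec = post_process(x_init)
print(np.allclose(A @ x_rec, y))  # minimum-norm solution reproduces the data
```

Since y lies in the range of A, the step-1 reconstruction A†y already satisfies the measurements exactly; the second step only has to repair what A† gets wrong in the null space.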
“…In this work, we compare the performance of the joint ℓ1-minimization algorithm of [1] with deep learning approaches for CS PAT image reconstruction. For the latter we use the residual network [13,20,21] and the nullspace network [22,23]. The nullspace network includes a certain data consistency layer and even has been shown to be a regularization method in [23].…”
Section: CS PAT Recovery Algorithms
confidence: 99%
“…For the latter we use the residual network [13,20,21] and the nullspace network [22,23]. The nullspace network includes a certain data consistency layer and even has been shown to be a regularization method in [23]. Our results show that the nullspace network uniformly outperforms the residual network for CS PAT in terms of the mean squared error (MSE).…”
Section: CS PAT Recovery Algorithms
confidence: 99%
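The data consistency property that the two excerpts above attribute to the nullspace network can be checked directly on a toy linear problem: the network output is x = A†y + (I − A†A) N(A†y), so the learned correction N acts only in the null space of A and cannot perturb the measured data. The sketch below assumes a random full-row-rank matrix for A and an arbitrary random map standing in for the trained network N.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 5))       # full row rank with probability one
A_pinv = np.linalg.pinv(A)
P_null = np.eye(5) - A_pinv @ A       # orthogonal projector onto null(A)

def N(x):
    # arbitrary stand-in for the trained network (hypothetical)
    W = rng.standard_normal((5, 5))
    return np.tanh(W @ x)

y = rng.standard_normal(3)            # measured data
x0 = A_pinv @ y                       # classical reconstruction A† y
x_out = x0 + P_null @ N(x0)           # correction restricted to null(A)

# A P_null = A - A A† A = 0, so A x_out = A A† y = y whatever N does
print(np.allclose(A @ x_out, y))
```

Because the correction is projected onto null(A) before being added, consistency with the data y holds by construction, independently of how well N was trained; this is the structural property exploited in the regularization analysis cited in [23].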
“…Recently, deep learning has achieved impressive results in various CT reconstruction fields [6], including low-dose denoising [7,8,9], sparse-view reconstruction [10], limited angle tomography [11], and metal artifact reduction [12]. In the field of interior tomography, Han and Ye applied the U-Net to remove null space artifacts [13] from FBP reconstruction. Observing its instability, they propose to use DBP reconstruction instead of the FBP reconstruction as the input of the U-Net for various types of ROI reconstruction tasks [14].…”
Section: Introduction
confidence: 99%