2019
DOI: 10.1007/s10851-019-00911-1

Big in Japan: Regularizing Networks for Solving Inverse Problems

Abstract: Deep learning and (deep) neural networks are emerging tools to address inverse problems and image reconstruction tasks. Despite outstanding performance, the mathematical analysis for solving inverse problems by neural networks is mostly missing. In this paper, we introduce and rigorously analyze families of deep regularizing neural networks (RegNets) of the form B_α + N_θ(α) ∘ B_α, where B_α is a classical regularization and the network N_θ(α) is trained to recover the missing part Id_X − B_α not found by the classical regularization…
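The RegNet construction B_α + N_θ(α) ∘ B_α from the abstract can be sketched in a few lines. The following is a minimal NumPy sketch, not the authors' implementation: it assumes Tikhonov regularization for B_α and accepts any callable for the trained network N (a zero network is used as a placeholder, since no trained weights are available here).

```python
import numpy as np

def tikhonov(A, y, alpha):
    """Classical regularization B_alpha (here: Tikhonov),
    solving (A^T A + alpha I) x = A^T y."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ y)

def regnet(A, y, alpha, N):
    """RegNet reconstruction (B_alpha + N o B_alpha)(y): the network N
    is meant to add back the part of the signal that B_alpha damps."""
    b = tikhonov(A, y, alpha)
    return b + N(b)

# Placeholder "network": with N = 0 the RegNet reduces to plain Tikhonov.
x_rec = regnet(np.eye(2), np.array([1.0, 1.0]), alpha=1.0,
               N=lambda z: np.zeros_like(z))
```

With A = I and alpha = 1, the Tikhonov step halves the data, so the zero-network RegNet returns [0.5, 0.5]; a trained N would be fitted to correct exactly this kind of systematic damping.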


Cited by 13 publications (7 citation statements)
References 25 publications (50 reference statements)
“…In this section we describe the proposed deep learning method for solving the discrete linear system (3). As shown in [11], the combination of a classical regularization method with deep CNNs yields a regularization method together with quantitative error estimates. Here we use the truncated SVD as the classical regularization method and combine it with a deep network that recovers the truncated coefficients.…”
Section: Deep Learning of Singular Value Expansion
confidence: 99%
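The combination quoted above can be sketched as follows. This is a hedged NumPy sketch, not the cited work's code: it assumes the learned component is a callable `predict_tail` (a hypothetical name) mapping the retained SVD coefficients to estimates of the truncated ones; the network architecture and its training are not shown.

```python
import numpy as np

def tsvd_plus_network(A, y, k, predict_tail):
    """Truncated-SVD reconstruction of Ax = y, keeping the k largest
    singular values, plus a learned estimate of the truncated part.

    predict_tail: callable mapping the k retained coefficients to
    estimates of the remaining (truncated) coefficients.
    """
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    coeff_kept = (U[:, :k].T @ y) / s[:k]   # classical TSVD coefficients
    coeff_tail = predict_tail(coeff_kept)   # learned, otherwise lost
    return Vt[:k].T @ coeff_kept + Vt[k:].T @ coeff_tail
```

When k equals the rank of A, the tail is empty and the method reduces to the ordinary (pseudo-)inverse; the learned part only matters for the coefficients that truncation discards.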
“…This includes CNNs in iterative schemes [5,6,1], or using a single CNN to improve an initial reconstruction [4,9,10]. Very recently, we have proven that sequences of suitable neural networks in combination with classical regularizations lead to convergent regularization methods [11]. In the present paper, we apply the concept of regularizing networks to the limited view problem of photoacoustic tomography (PAT).…”
Section: Introduction
confidence: 99%
“…We then train a neural network to predict the associated singular values from the model outputs, making it possible to compute an approximate Jacobian without the need for numerical differentiation, resulting in the proposed neural network augmented Quasi-Newton method (NN-QN). In related studies, researchers have proposed learned SVD frameworks in applications to regularized inverse problems [25]–[27]. We will first detail the mathematics and framework for the NN-QN approach in section II. We then present an application to the highly nonlinear inverse problem of electrical impedance tomography in section III.…”
Section: Introduction
confidence: 99%
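The NN-QN idea quoted above (an approximate Jacobian assembled from network-predicted singular values, with no numerical differentiation) can be illustrated schematically. This is a speculative sketch under strong simplifying assumptions, not the cited method: the singular vectors U, Vt are taken as fixed and known, and `predict_sigma` is a hypothetical stand-in for the trained network.

```python
import numpy as np

def nn_quasi_newton(F, x0, y, U, Vt, predict_sigma, iters=10):
    """Quasi-Newton iteration for F(x) = y where the Jacobian is
    approximated as J ~ U diag(s) Vt, with singular values s predicted
    by a network from the current model output (no finite differences).
    """
    x = x0.copy()
    for _ in range(iters):
        r = y - F(x)                   # residual
        s = predict_sigma(F(x))        # learned singular values
        # pseudo-inverse step: x += V diag(1/s) U^T r
        x = x + Vt.T @ ((U.T @ r) / s)
    return x
```

For a linear forward map with correctly predicted singular values, a single step already applies the exact pseudo-inverse; the interest of the learned variant is the nonlinear case, where the singular values change with x.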
“…In order to address the ill-posedness of linear inverse problems, regularizing networks of the form R_α = Φ ∘ G were introduced in [27,26]. Here G : Y → X defines any regularization and Φ : X → X are trained neural networks approximating data-consistent networks.…”
Section: Introduction
confidence: 99%