2018
DOI: 10.48550/arxiv.1803.11019
Preprint
Optimal Convergence Rates for Tikhonov Regularization in Besov Spaces

Cited by 4 publications (10 citation statements). References 0 publications.
“…We start from lower bounds for the estimation of g† = F(f†), see [12, Thms. 7, 9], [39, Cor. 4.12].…”
Section: Lower Bounds
Confidence: 99%
“…Finally, in Theorem 6.4 we derive convergence rates for the white noise model (4) using a method proposed in [39]. These rates are optimal for p > 1 and almost optimal for p = 1.…”
Section: Introduction
Confidence: 99%
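The citing paper's equation (4) is not reproduced in this excerpt; as context, the standard statistical inverse problem in the white noise model (an assumed generic form, not necessarily the paper's exact equation) can be written as:

```latex
% Generic white noise model for a statistical inverse problem
% (assumed standard form; the citing paper's equation (4) is not
% available in this excerpt).
% F : X \to Y      forward operator
% f^\dagger        true solution
% \sigma > 0       noise level
% \dot{W}          Gaussian white noise on Y
g^{\mathrm{obs}} = F(f^\dagger) + \sigma \dot{W}
```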
“…Let us mention that variational source conditions, which have been utilized recently for the analysis of regularization methods for nonlinear inverse problems, also allow for the proof of convergence rates without additional assertions on the nonlinearity of the problem. In this context, we refer to [7] and [17] for discussions of cross connections between conditional stability estimates and variational source conditions.…”
Section: Introduction
Confidence: 99%
“…More recently, there have been approaches (see e.g. [23][24][25]35]) to verify variational source conditions directly for specific problem instances without relying on nonlinearity assumptions or spectral source conditions or on both. In this case, it can happen that the set M allows to interchange f and f † in (11), and hence also a conditional stability estimate follows, see e.g.…”
Section: Assumption
Confidence: 99%