2018
DOI: 10.3934/ipi.2018026

Morozov principle for Kullback-Leibler residual term and Poisson noise

Abstract: We study the properties of a regularization method for inverse problems corrupted by Poisson noise, with the Kullback-Leibler divergence as the data term. The regularization parameter is chosen according to a Morozov-type principle. We show that this parameter-choice rule is well defined and that the resulting a posteriori choice leads to a convergent regularization method. Convergence rates are obtained for this a posteriori choice of the regularization parameter when a source condition is satisfied.
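For intuition only, the following is a minimal numerical sketch of such a Morozov-type parameter choice, not the paper's algorithm: it assumes a linear forward operator A, Poisson-distributed data f, a squared-norm penalty standing in for the paper's general regularizer, and a log-scale bisection that drives the Kullback-Leibler residual to τδ. The names kl_div, solve_penalized, and morozov_alpha are invented for this sketch.

import numpy as np
from scipy.optimize import minimize

def kl_div(v, f, eps=1e-12):
    # Generalized Kullback-Leibler divergence D(f, v) for Poisson data.
    v = np.maximum(v, eps)
    return np.sum(v - f + f * np.log(np.maximum(f, eps) / v))

def solve_penalized(A, f, alpha):
    # Minimize KL(Au, f) + alpha * ||u||^2 over u >= 0; the squared norm
    # is a toy stand-in for the paper's general convex regularizer.
    n = A.shape[1]
    res = minimize(lambda u: kl_div(A @ u, f) + alpha * (u @ u),
                   np.ones(n), bounds=[(0.0, None)] * n)
    return res.x

def morozov_alpha(A, f, delta, tau=1.1, lo=1e-6, hi=1e2, iters=30):
    # Morozov-type rule: find alpha whose KL residual matches tau * delta.
    # The residual is nondecreasing in alpha (assumed here, and standard
    # for such penalized problems), so a log-scale bisection applies.
    for _ in range(iters):
        mid = np.sqrt(lo * hi)
        r = kl_div(A @ solve_penalized(A, f, mid), f)
        if r > tau * delta:
            hi = mid   # residual too large: reduce regularization
        else:
            lo = mid   # residual below target: regularize more
    return np.sqrt(lo * hi)

# Toy usage: a nonnegative signal observed through A under Poisson noise.
rng = np.random.default_rng(0)
A = np.abs(rng.normal(size=(30, 20))) / 20.0
u_true = 50.0 * np.abs(rng.normal(size=20))
f = rng.poisson(A @ u_true).astype(float)
delta = kl_div(A @ u_true, f)      # noise level; in practice an estimate
u_rec = solve_penalized(A, f, morozov_alpha(A, f, delta))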

Cited by 8 publications (15 citation statements)
References 35 publications
“…Convergence rates for this choice of α_n in the case of an exact operator and an arbitrary convex regularisation functional were obtained in [11]. For the data fidelity given by the Kullback–Leibler divergence, the discrepancy principle is studied in [13].…”
Section: Discrepancy Principle
mentioning
confidence: 99%
“…Convergence rates for this choice of α_n in the case of an exact operator and an arbitrary convex regularisation functional were obtained in [10]. For the data fidelity given by the Kullback-Leibler divergence, the discrepancy principle is studied in [12].…”
Section: Discrepancy Principle
mentioning
confidence: 99%
“…For a priori parameter choice rules, convergence rates for solutions of (1.2) in different scenarios have been obtained, e.g., in [4,5,6,7,8]. A classical a posteriori parameter choice rule is the so-called discrepancy principle, originally introduced in [9] and later studied in, e.g., [10,11,12]. Roughly speaking, it consists in choosing α = α(f^δ, δ) such that the following equation is satisfied…”
mentioning
confidence: 99%
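The excerpt above breaks off before the defining equation. For orientation only (the exact formulation in the citing paper may differ), the classical discrepancy principle selects α = α(f^δ, δ) so that

S(F(u_α^δ), f^δ) = τδ, with τ ≥ 1,

where u_α^δ minimizes the penalized functional S(F(u), f^δ) + αR(u), S is the data-fidelity term (here the Kullback-Leibler divergence), f^δ is the noisy data, and δ is the noise level.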
“…Existence and uniqueness of the resulting penalized maximum likelihood estimators were proved in [3]. This was further refined in [26] by proving consistency and convergence rates of the KL risk under some source conditions. However, as the KL distance primarily measures the closeness of the distributions and our main interest is in function rather than distribution estimation, we prefer to work with the L² loss.…”
mentioning
confidence: 95%
“…In this article, we propose a new version of the discrepancy principle for Poisson inverse problems that, when compared to [3,26], is applicable to a broader class of estimators (i.e., not just penalized maximum likelihood estimators), leads to consistent (in probability) filter-induced estimators with explicit and close-to-optimal rates of convergence under the L² distance, and does not require any discretization, which may be of importance because it is known that discretization effects are not always negligible and may affect the convergence rates ([29,30]).…”
mentioning
confidence: 99%