2015
DOI: 10.1007/978-3-319-19749-4_3

Low Complexity Regularization of Linear Inverse Problems

Abstract: Inverse problems and regularization theory is a central theme in imaging sciences, statistics and machine learning. The goal is to reconstruct an unknown vector from partial indirect, and possibly noisy, measurements of it. A now standard method for recovering the unknown vector is to solve a convex optimization problem that enforces some prior knowledge about its structure. This chapter delivers a review of recent advances in the field where the regularization prior promotes solutions conforming to some notio…
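The recovery procedure described in the abstract can be illustrated with a minimal sketch: solve an ℓ1-regularized least-squares problem (the Lasso) by proximal gradient descent (ISTA). All problem data below (`A`, `x0`, `lam`) are synthetic assumptions for illustration, not taken from the chapter.

```python
import numpy as np

# Sketch: recover a sparse vector x0 from noisy measurements y = A @ x0 + w
# by solving   min_x  0.5 * ||A x - y||^2 + lam * ||x||_1
# with proximal gradient descent (ISTA). Data are synthetic assumptions.

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (component-wise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam, n_iter=500):
    """Minimize 0.5*||Ax - y||^2 + lam*||x||_1 by proximal gradient steps."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L, L = Lipschitz const of grad
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)
        x = soft_threshold(x - step * grad, step * lam)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100)) / np.sqrt(40)  # underdetermined operator
x0 = np.zeros(100)
x0[[3, 30, 70]] = [2.0, -1.5, 1.0]                # 3-sparse ground truth
y = A @ x0 + 0.01 * rng.standard_normal(40)       # noisy measurements
x_hat = ista(A, y, lam=0.05)
```

The ℓ1 penalty here is one instance of the low-complexity priors the chapter surveys; other priors (nuclear norm, total variation, group norms) fit the same template with a different proximal operator.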

Cited by 19 publications (33 citation statements). References 210 publications (248 reference statements).
“…Therefore, the concept of partly smooth functions appears to provide a broad class of functions that encompasses many popular objective models while enjoying a powerful calculus and sensitivity theory. The paper [9], where the theory was introduced, discusses many important examples, and numerous recent papers [6,12,13,20], in which partial smoothness plays a key role, describe further popular objective functions as partly smooth. For instance, various functions studied in signal processing and machine learning are partly smooth.…”
Section: Remark
confidence: 99%
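A key consequence of partial smoothness that these citing papers build on is finite identification: for the ℓ1 norm, which is partly smooth relative to the fixed-support manifold, proximal-gradient iterates land on the support of the solution after finitely many steps and stay there. The sketch below (synthetic data, hypothetical parameters) records the active set of each ISTA iterate to show it stabilizes.

```python
import numpy as np

# Illustration (assumptions, not from the cited papers): the support of
# ISTA iterates for a noiseless sparse recovery problem stops changing
# after finitely many iterations -- the active manifold is identified.

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

rng = np.random.default_rng(1)
A = rng.standard_normal((30, 60)) / np.sqrt(30)
x0 = np.zeros(60)
x0[[5, 25]] = [1.5, -2.0]                 # 2-sparse ground truth
y = A @ x0                                # noiseless measurements
lam = 0.1
step = 1.0 / np.linalg.norm(A, 2) ** 2
x = np.zeros(60)
supports = []
for _ in range(400):
    x = soft_threshold(x - step * A.T @ (A @ x - y), step * lam)
    supports.append(frozenset(int(i) for i in np.flatnonzero(x)))
# after a finite number of iterations the active set no longer changes
```

The same identification behavior holds for other partly smooth regularizers relative to their active manifolds (e.g. the nuclear norm relative to fixed-rank matrices).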
“…In subsequent papers [12,13,20], the authors extend the definition to convex cases. In this paper, we consider a slightly modified convex version.…”
Section: Partial Smoothness
confidence: 99%
“…Although R λ,ρ obviously differs from the true solution R λ , it is possible to show, following an approach similar to [48], that under some mild nondegeneracy hypothesis on η λ,ρ , and for small enough values of ρ, R λ,ρ is sufficiently close to R λ to allow accurate support reconstruction. In particular, both matrices have the same rank.…”
Section: Sensitivity Analysis
confidence: 99%
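The "same rank" conclusion quoted above can be seen in miniature through the proximal operator of the nuclear norm, which soft-thresholds singular values: for a suitable threshold, the prox of a slightly perturbed low-rank matrix has exactly the rank of the ground truth. The data and threshold below are synthetic assumptions, not taken from [48].

```python
import numpy as np

# Sketch: singular-value soft-thresholding (prox of the nuclear norm)
# recovers the rank of a low-rank matrix from a small perturbation.
# All data and the threshold t are illustrative assumptions.

def prox_nuclear(M, t):
    """Proximal operator of t * ||.||_* : soft-threshold singular values."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - t, 0.0)) @ Vt

rng = np.random.default_rng(2)
L = rng.standard_normal((20, 3)) @ rng.standard_normal((3, 20))  # rank 3
noisy = L + 0.01 * rng.standard_normal((20, 20))                 # perturbation
denoised = prox_nuclear(noisy, t=0.5)
rank = np.linalg.matrix_rank(denoised, tol=1e-8)
```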
“…The structure of the ℓ1 norm plays a crucial role in their result. Sufficient (but not necessary in general) conditions for robust recovery with a linear convergence rate have been proposed for a general class of regularizers by solving either (1.4) or (1.5); see [10,14,39,42,36] and references therein. For instance, [10] and [36] provide a sufficient condition, via the geometric notion of descent cone, for robust recovery with a linear rate by solving (1.4) (but not (1.5)).…”
Section: Introduction: Problem Statement
confidence: 99%