2011
DOI: 10.3934/ipi.2011.5.219

Structural stability in a minimization problem and applications to conductivity imaging

Cited by 18 publications (19 citation statements)
References 14 publications
“…In recent years many authors have devoted significant effort to weighted least gradient problems and their generalizations, owing to their applications in areas such as conductivity imaging, reduced models of superconductivity and superfluidity, models describing landslides, and relaxed models in elasticity theory and optimal design, among others. A list of important investigations in these directions can be found in [4,5,13,14,[17][18][19]21,23,[26][27][28][29][32][33][34][35][36][41][42][43]. In addition, the time-dependent total variation flow has proved useful in image processing, including denoising and restoration; see for example [2,3,6,8,25].…”
Section: Introduction (mentioning)
confidence: 99%
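The excerpt above points to total variation flow as a denoising tool. The sketch below is a minimal illustration, not taken from any of the cited works, of explicit time stepping for a smoothed TV flow u_t = div(∇u / √(|∇u|² + ε²)); the step size, smoothing parameter, and iteration count are ad hoc choices made for this example.

```python
import numpy as np

def tv_flow_denoise(img, n_steps=200, dt=0.05, eps=0.5):
    """Explicit Euler steps of a smoothed total variation flow,
    u_t = div( grad u / sqrt(|grad u|^2 + eps^2) ).
    The parameter eps regularizes the flow so the explicit step stays stable."""
    u = np.asarray(img, dtype=float).copy()
    for _ in range(n_steps):
        ux = np.gradient(u, axis=0)                  # finite-difference gradient
        uy = np.gradient(u, axis=1)
        mag = np.sqrt(ux**2 + uy**2 + eps**2)        # smoothed gradient magnitude
        px, py = ux / mag, uy / mag                  # regularized unit gradient field
        div = np.gradient(px, axis=0) + np.gradient(py, axis=1)  # its divergence
        u += dt * div                                # one explicit time step
    return u

# Usage on a synthetic noisy edge image
rng = np.random.default_rng(0)
clean = np.zeros((64, 64)); clean[:, 32:] = 1.0
noisy = clean + 0.2 * rng.standard_normal(clean.shape)
denoised = tv_flow_denoise(noisy)
print("mean abs error before:", np.abs(noisy - clean).mean(),
      "after:", np.abs(denoised - clean).mean())
```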
“…According to classical results (see [20]), the resulting algorithm using (23) and (24) is globally convergent to the set:…”
Section: Descent Algorithm and Global Convergence (mentioning)
confidence: 99%
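The step-length criteria (23) and (24) of the citing paper are not reproduced in the excerpt, so the following is only a generic stand-in: a steepest-descent loop with an Armijo-type sufficient-decrease backtracking rule, the classical setting in which such algorithms converge to the set of stationary points. The function names and parameter values are illustrative assumptions, not the authors' scheme.

```python
import numpy as np

def armijo_descent(f, grad, x0, sigma=1e-4, beta=0.5, tol=1e-8, max_iter=500):
    """Steepest descent with an Armijo sufficient-decrease backtracking rule;
    under the usual smoothness assumptions, accumulation points are stationary."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:                  # stationarity test
            break
        d = -g                                       # steepest-descent direction
        t = 1.0
        # backtrack until f(x + t d) <= f(x) + sigma * t * <grad f(x), d>
        while f(x + t * d) > f(x) + sigma * t * g.dot(d):
            t *= beta
        x = x + t * d
    return x

# Usage: minimize a smooth convex quadratic 0.5 x'Ax - b'x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
f = lambda x: 0.5 * x.dot(A @ x) - b.dot(x)
grad = lambda x: A @ x - b
x_star = armijo_descent(f, grad, np.zeros(2))
print("minimizer:", x_star, "gradient norm:", np.linalg.norm(A @ x_star - b))
```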
“…The search criteria (23) and (24) can be extended to a finite number of directions. Given parameters in $(0,1)$ and $p \ge 1$, in steps A3) and A4) we look for $p$ directions $D_k = (d_k^1, \ldots, d_k^p) \in X^p$ and $p$ corresponding step lengths in $\mathbb{R}_+^p$, such that:…”
Section: Descent Algorithm and Global Convergence (mentioning)
confidence: 99%
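The condition that the p directions and step lengths must satisfy is cut off in the excerpt. As a purely hypothetical illustration of searching over several directions at once, the sketch below backtracks a step length for each candidate direction and accepts the trial point with the largest decrease; none of these choices should be read as the authors' actual criteria.

```python
import numpy as np

def multi_direction_step(f, grad, x, directions, sigma=1e-4, beta=0.5):
    """Hypothetical multi-direction variant of the Armijo step above:
    backtrack a step length for each of the p candidate directions and
    keep the trial point that decreases f the most."""
    g, fx = grad(x), f(x)
    best_x, best_f = x, fx
    for d in directions:                              # candidates d_k^1, ..., d_k^p
        slope = g.dot(d)
        if slope >= 0:                                # skip non-descent directions
            continue
        t = 1.0
        while f(x + t * d) > fx + sigma * t * slope and t > 1e-12:
            t *= beta                                 # Armijo-type sufficient decrease
        fx_trial = f(x + t * d)
        if fx_trial < best_f:
            best_x, best_f = x + t * d, fx_trial
    return best_x
```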
“…Let $n$ be sufficiently large so that $a_n \le 2a$. Recall the functional $F_{\delta_n}(\cdot\,; a_n)$ in (27) with $\delta = \delta_n$ and $a = a_n$, and the induced norm on $H^1(\Omega)$ in (28). We estimate
$$\min\Big\{\tfrac{\epsilon}{2z}, \tfrac{\delta_n}{2}\Big\}\, \|u_n - h\|_1^2 \;\le\; F_{\delta_n}(u_n - h; a_n) \;\le\; F_{\delta_n}(u_n - h; a_n) + \int_\Omega a_n |\nabla u_n|\,dx \;=\; G_{\delta_n}(u_n; a_n) \;\le\; G_{\delta_n}(h; a_n)$$…”
Section: Convergence Properties of the Regularized Minimizing Sequence (mentioning)
confidence: 99%
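Reading off the first and last terms of the chain of inequalities above, and assuming (as the excerpt suggests) that $u_n$ minimizes $G_{\delta_n}(\cdot\,; a_n)$ while $h$ is an admissible comparison function, dividing by the positive constant yields the bound below. The constant $\min\{\epsilon/(2z), \delta_n/2\}$ is transcribed from the excerpt; its ingredients are defined in the cited paper's (27)–(28), not here.

```latex
% Rearranging the displayed chain of inequalities (sketch):
%   min{eps/(2z), delta_n/2} * ||u_n - h||_1^2  <=  G_{delta_n}(h; a_n)
\[
  \|u_n - h\|_{1}^{2}
  \;\le\;
  \frac{G_{\delta_n}(h;\,a_n)}{\min\bigl\{\tfrac{\epsilon}{2z},\,\tfrac{\delta_n}{2}\bigr\}},
\]
% so the regularized minimizers u_n stay in a fixed H^1(Omega) ball around the
% comparison function h whenever the right-hand side remains bounded in n.
```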