2006
DOI: 10.1590/s0101-82052006000100006
The effect of the nonlinearity on GCV applied to conjugate gradients in computerized tomography

Abstract: We study the effect of the nonlinear dependence of the iterate x_k of the Conjugate Gradients method (CG) on the data b in the GCV procedure used to stop the iterations. We compare two versions of using GCV to stop CG: in one we compute the GCV function with the iterate x_k depending linearly on the data b, and in the other depending nonlinearly. We have tested the two versions on a large-scale problem: positron emission tomography (PET). Our results suggest the necessity of considering the nonline…

Cited by 4 publications (7 citation statements) | References 30 publications
“…We employed k_max = N. In analogy with [15], the error is normalized so that its scale is comparable with the scale of the stopping criterion functions. The minimal product (MP) function is the most economical criterion, but did not predict early enough the error increase when the perturbations were given by α = 10^-3 and α = 10^-2.…”
Section: Numerical Results (mentioning, confidence: 99%)
“…We consider the classical conjugate gradient method for the normal equations to solve the least-squares linear system (2.9), which is often referred to as CGLS [3,10,15]. Given an initial guess s_0 and a maximum number of iterations k_max, set r_0 ← t − G s_0, z_0 ← G^T r_0, and p_0 ← z_0; repeat…”
Section: Conjugate Gradient Methods and Stopping Criteria (mentioning, confidence: 99%)
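The initialization in the excerpt is the standard start of CGLS; the loop body it truncates can be sketched as follows. This is a minimal illustration of the classical method, not code from the cited paper; the variable names (G, t, s, r, z, p) follow the excerpt, and the tolerance-based exit is an added convenience.

```python
import numpy as np

def cgls(G, t, k_max, tol=1e-8):
    """Conjugate gradients on the normal equations G^T G s = G^T t (CGLS).

    Sketch of the classical iteration the excerpt describes, starting
    from the zero initial guess s_0 = 0.
    """
    s = np.zeros(G.shape[1])
    r = t - G @ s            # r_0 = t - G s_0
    z = G.T @ r              # z_0 = G^T r_0
    p = z.copy()             # p_0 = z_0
    znorm = z @ z
    for _ in range(k_max):
        q = G @ p
        alpha = znorm / (q @ q)       # step length along p
        s += alpha * p
        r -= alpha * q                # update residual t - G s
        z = G.T @ r
        znorm_new = z @ z
        if np.sqrt(znorm_new) < tol:  # normal-equations residual small
            break
        p = z + (znorm_new / znorm) * p   # new conjugate direction
        znorm = znorm_new
    return s
```

In exact arithmetic CGLS reaches the least-squares solution in at most n iterations for n unknowns; in the ill-posed settings the paper studies, the iteration index itself acts as the regularization parameter, which is why the stopping rule matters.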
See 1 more Smart Citation
“…The discrepancy principle, widely used as a stopping rule, does not produce sufficiently accurate estimates of the stopping index. The use of the Generalized Cross Validation rule (GCV) [5,12,14] gives better results, even if in some cases it is not fully reliable.…”
(mentioning, confidence: 99%)
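For concreteness, a commonly used linearized form of the GCV stopping rule for iterative methods treats the iteration count k as the effective degrees of freedom of a linear filter, giving GCV(k) = m ||b − A x_k||² / (m − k)², and stops at the k minimizing this value. This is a sketch of the "linear dependence" version the abstract contrasts with the nonlinear one, under the stated trace ≈ k approximation; the function name and interface are illustrative, not from the paper.

```python
def gcv_linearized(m, residual_norms):
    """Linearized GCV values for iterates x_1, ..., x_K.

    Assumes (rough approximation) that after k iterations the solver acts
    like a linear filter whose influence-matrix trace is about k, so
        GCV(k) = m * ||b - A x_k||^2 / (m - k)^2,
    where m is the number of data points and residual_norms[k-1] holds
    ||b - A x_k||.  The stopping index is the k with minimal GCV(k).
    """
    return [m * rk**2 / (m - k)**2
            for k, rk in enumerate(residual_norms, start=1)]
```

Usage: run the iteration, record the residual norms, and stop at `1 + argmin` of the returned list; the paper's point is that ignoring the nonlinear dependence of x_k on b in this computation can make the rule unreliable.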
“…To obtain the same regularizing effect we would get by applying the Tikhonov method with the optimal parameter λ_T, a value for λ in (12) should verify ψ…”
(mentioning, confidence: 99%)