2009
DOI: 10.1016/j.cam.2008.05.003

Arnoldi–Tikhonov regularization methods

Abstract: Tikhonov regularization for large-scale linear ill-posed problems is commonly implemented by determining a partial Lanczos bidiagonalization of the matrix of the given system of equations. This paper explores the possibility of instead computing a partial Arnoldi decomposition of the given matrix. Computed examples illustrate that this approach may require fewer matrix-vector product evaluations and, therefore, less arithmetic work. Moreover, the proposed range-restricted Arnoldi–Tikhonov regula…
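The abstract describes projecting the Tikhonov problem onto a Krylov subspace built by the Arnoldi process instead of by Lanczos bidiagonalization. A minimal sketch of that idea, with assumed names and a fixed regularization parameter `mu` (the paper's actual range-restricted variant and parameter choice differ), might look like:

```python
import numpy as np

def arnoldi_tikhonov(A, b, k, mu):
    """Sketch of an Arnoldi-Tikhonov step: build a partial Arnoldi
    decomposition A V_k = V_{k+1} H_k of the square matrix A, then solve
    the small projected Tikhonov problem
        min_y ||H_k y - beta e_1||^2 + mu ||y||^2.
    Illustrative only; assumes no breakdown and no reorthogonalization."""
    n = b.size
    V = np.zeros((n, k + 1))
    H = np.zeros((k + 1, k))
    beta = np.linalg.norm(b)
    V[:, 0] = b / beta
    for j in range(k):
        w = A @ V[:, j]                 # one matrix-vector product per step
        for i in range(j + 1):          # modified Gram-Schmidt
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        V[:, j + 1] = w / H[j + 1, j]
    # Small (k+1) x k Tikhonov problem in the projected coordinates.
    rhs = np.zeros(k + 1)
    rhs[0] = beta
    y = np.linalg.solve(H.T @ H + mu * np.eye(k), H.T @ rhs)
    return V[:, :k] @ y
```

Note that only products with A appear, which is the source of the savings over Lanczos bidiagonalization claimed in the abstract.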

Cited by 58 publications (59 citation statements)
References 24 publications (53 reference statements)
“…Here, as defined in the abstract, a best possible regularized solution means one that is at least as accurate as the best regularized solution obtained by the truncated singular value decomposition (TSVD) method. Otherwise, the solver is said to have only partial regularization; in that case, computing a best possible regularized solution requires a hybrid variant, e.g., hybrid LSQR, which combines the solver with additional regularization [5,13,28,30,31,32] in order to remove the effects of small Ritz values and to expand the Krylov subspace until it captures all the dominant SVD components needed, so that the method obtains a best possible regularized solution. The regularizing effects of LSQR and CGLS have received intensive attention for years; see [20,22] and the references therein.…”
Section: TSVD (mentioning)
confidence: 99%
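The citation statement above uses TSVD as the accuracy benchmark for regularized solutions. For reference, a minimal sketch of TSVD regularization (the helper name is an assumption, not from the cited works):

```python
import numpy as np

def tsvd_solution(A, b, k):
    """Truncated SVD regularized solution: keep only the k largest
    singular triplets, discarding the components that would be
    amplified by division with tiny singular values."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    coeffs = (U[:, :k].T @ b) / s[:k]   # filtered expansion coefficients
    return Vt[:k].T @ coeffs
```

A hybrid method, by contrast, applies this kind of spectral filtering to the small projected problem rather than to A itself.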
“…When A is nonsymmetric and multiplication with A^T is difficult or impractical to compute, GMRES and its preferred variant RRGMRES are candidates [10,30]. The hybrid approach based on the Arnoldi process was first introduced in [11] and has been studied in [9,11,12,28]. Recently, Gazzola et al. [15,16,17,31] have studied further methods based on Lanczos bidiagonalization, the Arnoldi process, and the nonsymmetric Lanczos process for the severely ill-posed problem (1.1).…”
Section: TSVD (mentioning)
confidence: 99%
“…The value of the regularization parameter is computed with Newton's method. We remark that the computational effort for determining µ is negligible, because all computations are carried out with small matrices and vectors (of order about p × p and p, respectively); see, e.g., [20] for a description of similar computations. We compare the performance of several square regularization matrices.…”
Section: Numerical Examples (mentioning)
confidence: 99%
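The statement above notes that the regularization parameter µ is found with Newton's method at negligible cost, since only small (about p × p) matrices are involved. One common way to set this up, sketched here under the assumption that µ is chosen by the discrepancy principle on the projected problem (the helper name and signature are hypothetical):

```python
import numpy as np

def discrepancy_newton(H, c, delta, mu0=1.0, maxit=100, tol=1e-12):
    """Newton's method for the discrepancy equation
        phi(mu) = ||H y(mu) - c||^2 - delta^2 = 0,
    where y(mu) minimizes ||H y - c||^2 + mu ||y||^2 on the small
    projected problem. Every solve involves only small matrices,
    so the cost per iteration is negligible."""
    HtH, Htc = H.T @ H, H.T @ c
    I = np.eye(H.shape[1])
    mu = mu0
    for _ in range(maxit):
        M = HtH + mu * I
        y = np.linalg.solve(M, Htc)
        r = H @ y - c
        phi = r @ r - delta**2
        if abs(phi) < tol * delta**2:
            break
        dy = -np.linalg.solve(M, y)       # derivative of y w.r.t. mu
        dphi = 2.0 * (r @ (H @ dy))       # phi'(mu) > 0: residual grows with mu
        mu = max(mu - phi / dphi, 1e-15)  # Newton step, kept positive
    return mu
```

The derivative follows from differentiating (H^T H + µI) y = H^T c with respect to µ, which gives dy/dµ = −(H^T H + µI)^{-1} y.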
“…Each step of the Arnoldi process demands only the evaluation of one matrix-vector product with A. For many linear discrete ill-posed problems (1), solution methods for (4) based on the Arnoldi process require fewer matrix-vector product evaluations than methods based on partial Lanczos bidiagonalization; see [4,5,20,25] for illustrations. We are therefore interested in the derivation of square regularization matrices.…”
Section: Introduction (mentioning)
confidence: 99%
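The cost comparison in the statement above rests on the fact that partial Lanczos (Golub-Kahan) bidiagonalization needs products with both A and A^T at every step, whereas Arnoldi needs only one product with A. A minimal sketch of the bidiagonalization (assumed names; no reorthogonalization or breakdown handling) makes the extra transpose product explicit:

```python
import numpy as np

def golub_kahan(A, b, k):
    """Partial Golub-Kahan (Lanczos) bidiagonalization A V_k = U_{k+1} B_k.
    Each step needs one product with A *and* one with A.T, i.e. twice
    the per-step matrix-vector cost of the Arnoldi process."""
    m, n = A.shape
    U = np.zeros((m, k + 1))
    V = np.zeros((n, k))
    B = np.zeros((k + 1, k))                 # lower bidiagonal
    U[:, 0] = b / np.linalg.norm(b)
    for j in range(k):
        w = A.T @ U[:, j]                    # product with A^T
        if j > 0:
            w -= B[j, j - 1] * V[:, j - 1]
        B[j, j] = np.linalg.norm(w)          # alpha_j
        V[:, j] = w / B[j, j]
        u = A @ V[:, j] - B[j, j] * U[:, j]  # product with A
        B[j + 1, j] = np.linalg.norm(u)      # beta_{j+1}
        U[:, j + 1] = u / B[j + 1, j]
    return U, B, V
```

In exact arithmetic the columns of U and V are orthonormal and A V_k = U_{k+1} B_k holds exactly, which is what hybrid Lanczos-based methods project onto.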
“…As is well known, these methods typically exhibit fast superlinear convergence when applied to discrete ill-posed problems, and hence they are particularly attractive for large-scale problems. For methods of this kind, efficient algorithms based on the solution of (5) have been considered in [14] and [22]. More recently, a very simple strategy for solving (5), based on the linearization of φ_m(λ), has been presented in [5].…”
Section: Introduction (mentioning)
confidence: 99%