2009
DOI: 10.1080/10556780802370746
Comparison of advanced large-scale minimization algorithms for the solution of inverse ill-posed problems

Cited by 38 publications (33 citation statements)
References 34 publications
“…Comparisons of different optimization methods for data assimilation with fluid flow models are given by Alekseev and Navon [3] and Daescu and Navon [23]. Quasi-Newton methods approximate the inverse of the Hessian matrix by a symmetric positive definite matrix, which is updated at every step using the new search directions and the new gradients.…”
Section: The Chemical Transport Model and The Test Case
confidence: 99%
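The update described in this quotation — a symmetric positive definite approximation of the inverse Hessian refreshed from the latest step and gradient change — is the classical BFGS recursion. The following is a minimal sketch on a hypothetical toy quadratic (the matrix `A`, vector `b`, and the unit step length are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def bfgs_update(H, s, y):
    """One BFGS update of the inverse-Hessian approximation H.

    s = x_new - x (the step taken), y = g_new - g (the gradient change).
    The formula preserves symmetry and positive definiteness of H
    whenever the curvature condition s @ y > 0 holds.
    """
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

# Hypothetical toy problem: f(x) = 0.5 x^T A x - b^T x, grad f(x) = A x - b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad = lambda x: A @ x - b

x = np.zeros(2)
H = np.eye(2)                      # initial SPD guess for the inverse Hessian
for _ in range(20):
    g = grad(x)
    s = -H @ g                     # quasi-Newton step (unit step length here;
                                   # practical codes add a line search)
    x_new = x + s
    y = grad(x_new) - g
    if y @ s > 1e-12:              # update only when curvature is positive
        H = bfgs_update(H, s, y)
    x = x_new
# x now approximates the minimizer A^{-1} b
```

Limited-memory variants (L-BFGS), which store only a few recent `(s, y)` pairs instead of the dense matrix `H`, are the standard choice at the large scales the paper considers.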
“…At the end of the backward in time integration, (13) provides the gradient and the Hessian-vector product…”
Section: Discrete SOA
confidence: 99%
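The second-order adjoint (SOA) referred to here yields the exact Hessian-vector product from one extra backward integration. When only first-order gradients are available, a common cheap surrogate is a finite difference of gradients; a minimal sketch on a hypothetical quadratic test function (the matrix `A` and the points `x`, `v` are illustrative assumptions):

```python
import numpy as np

def hessvec(grad, x, v, eps=1e-6):
    """Approximate the Hessian-vector product H(x) @ v.

    A second-order adjoint model delivers this product exactly from one
    extra backward-in-time integration; the forward difference of two
    gradient evaluations below is the usual gradient-only surrogate.
    """
    return (grad(x + eps * v) - grad(x)) / eps

# Sanity check on a quadratic, where the Hessian is the constant matrix A.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
grad = lambda x: A @ x
x = np.array([1.0, -2.0])
v = np.array([0.5, 1.0])
hv = hessvec(grad, x, v)           # ≈ A @ v
```

Because only products with chosen directions `v` are needed, truncated-Newton and Hessian-free methods can exploit second-order information without ever forming the Hessian itself.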
“…al. compared the performance of different minimization algorithms in the solution of inverse problems [13]. Exploring this previous work constitutes the motivation behind the following research results.…”
Section: Introduction
confidence: 99%
“…Then it slows down after a relatively small number of iterations. Possibly, this is linked to the number of Hessian dominant eigenvalues [33]. From this viewpoint, the regularization is an important property of conjugate gradient algorithms [32].…”
Section: Numerical Results and Comparisonsmentioning
confidence: 99%
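The regularizing behavior this quotation points to comes from the fact that conjugate gradient iterations tend to resolve solution components along the dominant Hessian eigenvectors first, so truncating the iteration acts as a spectral filter. A minimal sketch of plain CG for a symmetric positive definite system, on a hypothetical 2×2 example (the matrix `A` and vector `b` are illustrative assumptions):

```python
import numpy as np

def cg(A, b, iters):
    """Conjugate gradient for SPD A, truncated after `iters` steps.

    Components of the solution along the dominant eigenvectors of A are
    captured in the early iterations, so stopping early filters out the
    small-eigenvalue directions that amplify noise in ill-posed problems.
    """
    x = np.zeros_like(b)
    r = b - A @ x                  # initial residual
    p = r.copy()                   # initial search direction
    rs = r @ r
    for _ in range(iters):
        Ap = A @ p
        alpha = rs / (p @ Ap)      # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if rs_new < 1e-30:         # residual vanished: converged
            break
        p = r + (rs_new / rs) * p  # new A-conjugate direction
        rs = rs_new
    return x

# In exact arithmetic CG terminates in at most n iterations for an n x n system.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = cg(A, b, 2)                    # ≈ np.linalg.solve(A, b)
```

The truncation level `iters` then plays the role of a regularization parameter, analogous to the cutoff in truncated SVD.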