2015
DOI: 10.1007/s10957-015-0781-1

A Modified Hestenes and Stiefel Conjugate Gradient Algorithm for Large-Scale Nonsmooth Minimizations and Nonlinear Equations

Abstract: It is well known that nonlinear conjugate gradient methods are very effective for large-scale smooth optimization problems. However, their efficiency has not been widely investigated for large-scale nonsmooth problems, which are often found in practice. This paper proposes a modified Hestenes-Stiefel conjugate gradient algorithm for nonsmooth convex optimization problems. The search direction of the proposed method not only possesses the sufficient descent property but also belongs to a trust region. Under sui…
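The abstract describes an HS-type direction that is guaranteed to be a descent direction. As context, the sketch below shows the classical Hestenes-Stiefel formula with a simple sufficient-descent safeguard (restarting to steepest descent when the test fails), applied to a smooth quadratic. This is only an illustration of the general HS scheme; the paper's actual modified formula and its nonsmooth (convex) setting are not reproduced here, and the Armijo line search and restart rule are assumptions for the sketch.

```python
import numpy as np

def hs_cg(f, grad, x0, max_iter=500, tol=1e-8):
    """Nonlinear CG with the classical Hestenes-Stiefel beta.

    Illustrative sketch only: Armijo backtracking plus a restart-based
    sufficient-descent safeguard, not the paper's modified formula.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search
        alpha = 1.0
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * g.dot(d):
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d.dot(y)
        # classical HS coefficient: beta = g_{k+1}^T y_k / (d_k^T y_k)
        beta_hs = g_new.dot(y) / denom if abs(denom) > 1e-12 else 0.0
        d_new = -g_new + beta_hs * d
        # safeguard: restart with -g if d_new is not a descent direction
        if g_new.dot(d_new) > -1e-10 * g_new.dot(g_new):
            d_new = -g_new
        x, g, d = x_new, g_new, d_new
    return x

# usage: minimize the convex quadratic f(x) = x^T A x / 2 - b^T x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = hs_cg(lambda x: 0.5 * x @ A @ x - b @ x,
               lambda x: A @ x - b, np.zeros(2))
```

For this quadratic the minimizer solves A x = b, so the result can be checked against a direct linear solve.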

Cited by 95 publications (43 citation statements)
References 87 publications (105 reference statements)
“…The conjugate gradient [25] transforms the inverse problem into three subproblems: the direct problem, the sensitivity problem, and the adjoint problem. Starting from the assumed boundary shape (1d) and the other boundary conditions, the boundary element method is used to solve (1a)-(1e), and the minimum of the objective function (2) is then computed.…”
Section: Conjugate Gradient Method
confidence: 99%
“…Hong et al. [21][22][23][24] optimized the PSO algorithm so that it can search multiple candidate solutions simultaneously and produce an unbiased estimate, providing a better way to find the globally optimal solution in the search space. Yuan et al. [25] introduced improved genetic-algorithm and particle swarm optimization methods for the two-dimensional steady-state thermal boundary problem.…”
Section: Introduction
confidence: 99%
“…This formula is inspired by the ideas of these two papers [6,8]. In recent years, many researchers have studied three-term conjugate gradient formulas because of their good properties [7].…”
Section: Motivation and Algorithm
confidence: 99%
“…The HS method gives good numerical results for (1); however, its convergence theory is unsatisfactory, especially for nonconvex functions. At present, many good conjugate gradient methods exist (see [6][7][8], etc.). Yuan, Wei, and Lu [9] gave a modified weak Wolfe-Powell line search (which we call YWL) for the steplength, designed by…”
Section: Introduction
confidence: 99%