2018
DOI: 10.1155/2018/5057096
A New Modified Three-Term Hestenes–Stiefel Conjugate Gradient Method with Sufficient Descent Property and Its Global Convergence

Abstract: This paper describes a modified three-term Hestenes–Stiefel (HS) method. The original HS method is the earliest conjugate gradient method. Although the HS method achieves global convergence using an exact line search, this is not guaranteed in the case of an inexact line search. In addition, the HS method does not usually satisfy the descent property. Our modified three-term conjugate gradient method possesses a sufficient descent property regardless of the type of line search and guarantees global convergence…
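The abstract does not reproduce the modified update itself. As background, a minimal sketch of the classical HS parameter and of one well-known three-term HS direction (the family the paper modifies), together with the sufficient descent condition such modifications are designed to enforce, is given below; the notation g_k = ∇f(x_k), y_{k-1} = g_k − g_{k-1} and the particular choice of θ_k are assumptions for illustration, not the paper's exact formulas.

% Sketch only: classical HS parameter and a standard three-term HS direction;
% the paper's modified formula may differ.
\[
\beta_k^{\mathrm{HS}} = \frac{g_k^{\top} y_{k-1}}{d_{k-1}^{\top} y_{k-1}}, \qquad
d_k = -g_k + \beta_k^{\mathrm{HS}} d_{k-1} - \theta_k y_{k-1}, \qquad
\theta_k = \frac{g_k^{\top} d_{k-1}}{d_{k-1}^{\top} y_{k-1}},
\]
\[
\text{sufficient descent:} \quad g_k^{\top} d_k \le -c\,\lVert g_k \rVert^{2}
\quad \text{for some } c > 0 \text{ and all } k.
\]

With this choice of θ_k the identity g_k^{\top} d_k = -\lVert g_k \rVert^{2} holds independently of the line search, which is the kind of descent property the abstract refers to.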

Cited by 6 publications (7 citation statements) · References: 40 publications
“…This article modified and extended the work of Baluch et al. [45] to solve nonlinear monotone operator equations. The modification became necessary in order to establish the descent and boundedness properties of the search direction without the use of a line search.…”
Section: Discussion
confidence: 98%
“…The search direction generated by the algorithm is a three-term direction and does not require the derivative of the operator. To aid understanding of the motivation, the algorithm proposed by Baluch et al. [45] for finding a solution to the unconstrained optimization problem is recalled. Consider the unconstrained optimization problem:…”
Section: Algorithm
confidence: 99%
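The quoted passage truncates the recalled problem; the standard unconstrained formulation it refers to is, in assumed notation,

% Standard unconstrained problem and iteration; notation assumed, not quoted from [45].
\[
\min_{x \in \mathbb{R}^{n}} f(x), \qquad
f : \mathbb{R}^{n} \to \mathbb{R} \ \text{continuously differentiable},
\]
with iterates generated by x_{k+1} = x_k + α_k d_k for a step size α_k > 0 and a search direction d_k.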
“…where f : R^n → R is the objective function, h_k(x) : R^n → R are the equality constraint functions, g_k(x) : R^n → R are the inequality constraint functions, and all are twice continuously differentiable (C^2) functions. One of the main ideas in solving unconstrained NLP is to search for the next point by choosing a proper search direction d_k and step size α_k, as in the Newton direction [45], the trust-region algorithm for unconstrained optimization [46], the descent method [47], the conjugate gradient method [48], the three-term conjugate gradient method [49], the subspace method for nonlinear optimization [50], the hybrid method for convex NLP [51], CCM for optimization problems and applications [52], and descent-direction stochastic approximation for optimization problems [53]. There are also studies of other approaches.…”
Section: Construction of OHAM-LS with FOGBDS Generated by NLPCOPs
confidence: 99%
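For concreteness, the constrained problem described in this passage can be written as below; the index sets E and I and the sign convention g_k(x) ≤ 0 are assumed notation (the excerpt does not state them).

% General NLP as described in the quoted passage; index sets and sign convention are assumptions.
\[
\min_{x \in \mathbb{R}^{n}} f(x)
\quad \text{subject to} \quad
h_k(x) = 0,\; k \in \mathcal{E}, \qquad
g_k(x) \le 0,\; k \in \mathcal{I},
\]
with f, h_k, and g_k twice continuously differentiable. Dropping both constraint sets recovers the unconstrained NLP to which the search-direction methods listed above apply.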