2022
DOI: 10.3390/math10193595
A Family of Hybrid Stochastic Conjugate Gradient Algorithms for Local and Global Minimization Problems

Abstract: This paper contains two main parts, Part I and Part II, which discuss the local and global minimization problems, respectively. In Part I, a fresh conjugate gradient (CG) technique is suggested and then combined with a line-search technique to obtain a globally convergent algorithm. A finite-difference approximation approach is used to compute approximate values of the first derivative of the function f. The convergence analysis of the suggested method is established. The comparisons between the perform…
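To make the ingredients named in the abstract concrete, the following is a minimal Python sketch of a nonlinear CG iteration with a backtracking (Armijo) line search and a forward-difference gradient. It is not the paper's actual algorithm: the classical Fletcher-Reeves parameter stands in for the paper's own CG formula, and all function names, tolerances, and constants are illustrative assumptions.

import numpy as np

def fd_grad(f, x, h=1e-6):
    """Forward-difference approximation of the gradient of f at x."""
    fx = f(x)
    g = np.empty_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

def cg_minimize(f, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG with a Fletcher-Reeves beta and Armijo backtracking.

    A generic sketch only: the paper proposes its own CG parameter and
    line-search rules, which are not reproduced here.
    """
    x = np.asarray(x0, dtype=float)
    g = fd_grad(f, x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0:   # restart with steepest descent if d is
            d = -g          # not a descent direction
        t = 1.0
        while f(x + t * d) > f(x) + 1e-4 * t * g.dot(d):
            t *= 0.5        # backtrack until the Armijo condition holds
        x = x + t * d
        g_new = fd_grad(f, x)
        beta = g_new.dot(g_new) / g.dot(g)  # Fletcher-Reeves parameter
        d = -g_new + beta * d
        g = g_new
    return x

# Example: the quadratic below has its minimizer at (1, -2).
print(cg_minimize(lambda x: (x[0] - 1)**2 + 2 * (x[1] + 2)**2, np.array([5.0, 5.0])))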

Cited by 3 publications (6 citation statements) · References 93 publications
“…Conjugate gradient methods (CGs) are associated with a very strong global convergence theory for a local minimizer and they have low memory requirements. Moreover, in practice, combining the CG method with a line search strategy showed merit in dealing with an unconstrained minimization problem [20-25].…”
Section: Introduction (mentioning)
confidence: 99%
“…Recently, many novel formulas for determining the parameter β_k have been suggested, with those corresponding to CG methods including two or three terms (see [21,22,25,51,52]).…”
Section: Introduction (mentioning)
confidence: 99%
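For context only (standard textbook formulas, not the novel two- and three-term variants referred to above), two classical single-term choices of β_k in the search-direction update d_{k+1} = -g_{k+1} + β_k d_k are the Fletcher-Reeves and Polak-Ribière rules:

\beta_k^{\mathrm{FR}} = \frac{\|g_{k+1}\|^2}{\|g_k\|^2},
\qquad
\beta_k^{\mathrm{PR}} = \frac{g_{k+1}^{\top}\,(g_{k+1} - g_k)}{\|g_k\|^2}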
“…Sundry advanced methods have provided fair results when numerically calculating the gradient vector values. See, for example, [34-37,43-45].…”
Section: Introduction (mentioning)
confidence: 99%
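As a minimal, generic illustration of such numerical gradient computation (not the specific schemes of the works cited above), a central-difference approximation can be written as follows; the helper name central_diff_grad and the default step h are illustrative choices:

import numpy as np

def central_diff_grad(f, x, h=1e-6):
    """Approximate the gradient of f at x with central differences.

    Costs 2n evaluations of f for an n-dimensional x; the step h trades
    truncation error (h too large) against round-off error (h too small).
    """
    x = np.asarray(x, dtype=float)
    g = np.empty_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

# Example: the gradient of the sphere function at (1, 2) is (2, 4).
print(central_diff_grad(lambda x: np.dot(x, x), [1.0, 2.0]))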
“…For example, the authors of [43,44] proposed a random mechanism for selecting the optimal finite-difference interval h and presented good results when solving unconstrained minimization problems. However, when solving Problem (1), using numerical differentiation to compute approximate values of the Jacobian matrix is extremely expensive.…”
Section: Introduction (mentioning)
confidence: 99%
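To see why the Jacobian case is so costly, note that a forward-difference approximation needs one extra evaluation of the residual map per column. The sketch below (a hypothetical helper, assuming a residual map F from R^n to R^m; it is not the random interval-selection mechanism of [43,44]) makes the n + 1 evaluation count explicit:

import numpy as np

def forward_diff_jacobian(F, x, h=1e-8):
    """Approximate the m-by-n Jacobian of F at x with forward differences.

    Needs n + 1 evaluations of F, which dominates the cost when each
    evaluation is expensive or n is large.
    """
    x = np.asarray(x, dtype=float)
    F0 = np.asarray(F(x), dtype=float)
    J = np.empty((F0.size, x.size))
    for j in range(x.size):
        e = np.zeros_like(x)
        e[j] = h
        J[:, j] = (np.asarray(F(x + e), dtype=float) - F0) / h
    return J

# Example: F(x) = (x0*x1, x0 + x1); the exact Jacobian at (2, 3) is [[3, 2], [1, 1]].
print(forward_diff_jacobian(lambda x: np.array([x[0] * x[1], x[0] + x[1]]), [2.0, 3.0]))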