Conjugate gradient methods are efficient for minimizing differentiable objective functions in large-dimensional spaces. Recently, Dai and Yuan introduced a three-parameter family of nonlinear conjugate gradient methods and showed its convergence. However, line search strategies usually bring a computational burden. To overcome this problem, in this paper we study the global convergence of a special case of the three-parameter family (the CD-DY family) in which the line search procedures are replaced by fixed stepsize formulae.
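To make the "fixed stepsize formula" idea concrete, here is a minimal sketch of a nonlinear conjugate gradient iteration (Fletcher-Reeves update) where the line search is replaced by a closed-form stepsize. The particular formula alpha_k = -g_k^T d_k / (L ||d_k||^2), which assumes a known Lipschitz constant L for the gradient, is an illustrative assumption and is not claimed to be the exact formula studied in the paper.

```python
import numpy as np

def cg_fixed_step(grad, x0, L, iters=200):
    """Conjugate gradient with a fixed stepsize formula (no line search).

    grad : callable returning the gradient at x
    L    : assumed Lipschitz constant of the gradient (hypothetical input)
    """
    x = x0.copy()
    g = grad(x)
    d = -g
    for _ in range(iters):
        # Fixed stepsize formula instead of a line search:
        # minimizes the L-smooth upper bound of f along direction d.
        alpha = -(g @ d) / (L * (d @ d))
        x = x + alpha * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)  # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        g = g_new
    return x
```

On a simple strongly convex quadratic this iteration converges without ever evaluating the objective along the search direction, which is the computational saving the no-line-search approach targets.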
The shortest-residual family of conjugate gradient methods was first proposed by Hestenes and later studied by Pytlak and by Dai and Yuan. Recently, a no-line-search scheme for conjugate gradient methods was given by Sun and Zhang, and by Chen and Sun. In this paper, we show the global convergence of two shortest-residual conjugate gradient methods (FRSR and PRPSR) without line search. In addition, computational results are presented showing that the methods without line search have numerical behavior similar to that of their line-search counterparts.
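A common way to state the shortest-residual idea is that the search direction is minus the shortest vector in the convex hull of the current gradient and a scaled previous direction; the scaling coefficient beta then distinguishes variants such as FRSR (Fletcher-Reeves beta) and PRPSR (Polak-Ribiere-Polyak beta). The sketch below shows the closed-form minimizer over the hull; the function names and the exact pairing of vectors are illustrative assumptions, not the paper's definitive formulation.

```python
import numpy as np

def shortest_in_hull(a, b):
    """Shortest vector in conv{a, b} = {t*a + (1-t)*b : t in [0, 1]}.

    ||t*a + (1-t)*b||^2 is quadratic in t, so the minimizer has a
    closed form, clipped to the interval [0, 1].
    """
    diff = a - b
    denom = diff @ diff
    if denom == 0.0:
        return a.copy()  # a == b: the hull is a single point
    t = np.clip((b @ (b - a)) / denom, 0.0, 1.0)
    return t * a + (1.0 - t) * b

def sr_direction(g, d_prev, beta):
    # Shortest-residual direction (sketch): minus the shortest vector in
    # the convex hull of the gradient and -beta * previous direction.
    # Choosing beta by the FR or PRP formula gives FRSR / PRPSR variants.
    return -shortest_in_hull(g, -beta * d_prev)
```

For example, with g = (1, 0) and -beta * d_prev = (0, 1), the shortest hull vector is (0.5, 0.5), so the resulting direction bisects the two inputs; when one input dominates, the clipping returns the shorter endpoint.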