Problem statement: The Conjugate Gradient (CG) algorithm, which is usually used for solving nonlinear functions, is presented and combined with the modified Back Propagation (BP) algorithm, yielding a new fast multilayer training algorithm. Approach: This study determined new search directions by exploiting the information calculated by gradient descent as well as the previous search direction. The proposed algorithm improved the training efficiency of the BP algorithm by adaptively modifying the initial search direction. Results: The performance of the proposed algorithm was demonstrated by comparing it with the Neural Network (NN) algorithm on the chosen test functions. Conclusion: The numerical results showed that the number of iterations required by the proposed algorithm to converge was smaller than for both the standard CG and NN algorithms. The proposed algorithm improved the training efficiency of BP-NN algorithms by adaptively modifying the initial search direction.
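As background for how a CG method builds each new search direction from the current gradient and the previous direction, here is a minimal, generic nonlinear CG sketch with a Fletcher-Reeves beta and a backtracking line search. This is not the paper's BP hybrid; the function names, the test function, and the restart safeguard are illustrative assumptions.

```python
import math

def nonlinear_cg(f, grad, x, max_iter=1000, tol=1e-6):
    """Generic nonlinear CG: each new direction combines the current
    gradient with the previous search direction (Fletcher-Reeves beta)."""
    g = grad(x)
    d = [-gi for gi in g]                  # first direction: steepest descent
    for _ in range(max_iter):
        if math.sqrt(sum(gi * gi for gi in g)) < tol:
            break
        gTd = sum(gi * di for gi, di in zip(g, d))
        if gTd >= 0:                       # safeguard: restart if not a descent direction
            d = [-gi for gi in g]
            gTd = -sum(gi * gi for gi in g)
        # Backtracking (Armijo) line search along d.
        alpha, fx = 1.0, f(x)
        while f([xi + alpha * di for xi, di in zip(x, d)]) > fx + 1e-4 * alpha * gTd:
            alpha *= 0.5
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        # Fletcher-Reeves coefficient: ||g_new||^2 / ||g||^2.
        beta = sum(gi * gi for gi in g_new) / sum(gi * gi for gi in g)
        d = [-gi + beta * di for gi, di in zip(g_new, d)]
        g = g_new
    return x

# Illustrative run on a simple convex quadratic f(x, y) = x^2 + 10 y^2.
f = lambda x: x[0] ** 2 + 10 * x[1] ** 2
grad = lambda x: [2 * x[0], 20 * x[1]]
xmin = nonlinear_cg(f, grad, [3.0, -2.0])
```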
<p>In this article, two versions, ξk and ρk, of an algorithm based on a modified Fletcher-Reeves two-term CG method are derived, together with an additional term that guarantees a descent search direction toward the minimum point of the function under an inexact line search, and their convergence is proved. The two algorithms are then combined with the Cuckoo Search algorithm, achieving a notable reduction in the number of iterations required to minimize 10 unconstrained test functions in the numerical experiments.</p>
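For readers unfamiliar with the Cuckoo Search component, the following is a minimal, generic sketch of that metaheuristic, not the article's hybrid CG-Cuckoo method. Gaussian steps stand in for the Lévy flights of the full method, and all parameter values and names are illustrative assumptions.

```python
import random

def cuckoo_search(f, dim, n_nests=15, iters=200, pa=0.25, bounds=(-5.0, 5.0)):
    """Simplified Cuckoo Search: propose candidates near the best nest,
    replace a randomly chosen nest if the candidate is better, and
    abandon a fraction pa of the worst nests each generation."""
    random.seed(0)                         # fixed seed for reproducibility
    lo, hi = bounds
    nests = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_nests)]
    best = min(nests, key=f)
    for _ in range(iters):
        # New cuckoo: Gaussian random walk around the current best
        # (the full method uses Lévy-flight steps instead).
        step = 0.1 * (hi - lo)
        cand = [min(hi, max(lo, b + step * random.gauss(0, 1))) for b in best]
        j = random.randrange(n_nests)
        if f(cand) < f(nests[j]):
            nests[j] = cand
        # Abandon the worst pa fraction of nests and re-seed them randomly.
        nests.sort(key=f)
        k = int(pa * n_nests)
        for i in range(n_nests - k, n_nests):
            nests[i] = [random.uniform(lo, hi) for _ in range(dim)]
        best = min(nests + [best], key=f)  # keep the best solution found so far
    return best

# Illustrative run on the sphere function.
sphere = lambda x: sum(xi * xi for xi in x)
best = cuckoo_search(sphere, dim=2)
```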
In this paper, a modified globally convergent self-scaling BFGS algorithm for solving convex unconstrained optimization problems is investigated. It employs an exact line search strategy, and the inverse Hessian matrix approximations remain positive definite. Experimental results indicate that the proposed algorithm is more efficient than the standard BFGS algorithm.
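As a reference point, here is a minimal sketch of the standard BFGS inverse-Hessian update that such algorithms modify, run with an exact line search (available in closed form for a quadratic objective). The test problem and names are assumptions for illustration; the self-scaling modification itself is not shown.

```python
import numpy as np

# Quadratic test problem f(x) = 0.5 x^T A x - b^T x with A symmetric positive definite.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b

def bfgs(x, iters=50, tol=1e-10):
    """Standard BFGS with exact line search on the quadratic above."""
    H = np.eye(len(x))                     # initial inverse-Hessian approximation
    g = grad(x)
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g
        # Exact line search: minimizer of f(x + alpha d) in closed form.
        alpha = -(g @ d) / (d @ A @ d)
        s = alpha * d
        x = x + s
        g_new = grad(x)
        y = g_new - g
        if y @ s > 0:                      # curvature condition keeps H positive definite
            rho = 1.0 / (y @ s)
            I = np.eye(len(x))
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        g = g_new
    return x, H

x_star, H = bfgs(np.array([5.0, -3.0]))
# At the minimum, grad(x_star) = 0, i.e., A x_star = b.
```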
In this study, we propose a new hybrid algorithm that combines the search directions of the Steepest Descent (SD) and Quasi-Newton (QN) methods. First, we develop a new search direction for combined conjugate gradient (CG) and QN methods. Second, we present a new positive CG method that possesses the sufficient descent property under the strong Wolfe line search. We also prove a new theorem ensuring the global convergence property under some given conditions. Our numerical results show that the new algorithm is robust compared with other standard large-scale CG methods.
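The abstract names two standard ingredients of such convergence proofs: the sufficient descent property of a direction and the strong Wolfe conditions on the step length. As a small illustration (not the authors' method; all names, constants, and the test function are assumptions), the following sketch checks both conditions numerically:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def sufficient_descent(g, d, c=0.1):
    """Sufficient descent property: g . d <= -c * ||g||^2."""
    return dot(g, d) <= -c * dot(g, g)

def strong_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.1):
    """Strong Wolfe conditions for step length alpha along direction d:
    Armijo decrease plus a bound on |grad . d| at the new point."""
    x_new = [xi + alpha * di for xi, di in zip(x, d)]
    armijo = f(x_new) <= f(x) + c1 * alpha * dot(grad(x), d)
    curvature = abs(dot(grad(x_new), d)) <= c2 * abs(dot(grad(x), d))
    return armijo and curvature

# Illustrative check on f(x, y) = x^2 + y^2 at x = (1, 1).
f = lambda x: x[0] ** 2 + x[1] ** 2
grad = lambda x: [2 * x[0], 2 * x[1]]
x = [1.0, 1.0]
g = grad(x)
d = [-gi for gi in g]             # steepest-descent direction
assert sufficient_descent(g, d)   # holds trivially here (g . d = -||g||^2)
```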