In this paper, we investigate a new scaling parameter for the standard memoryless L-BFGS algorithm. The new choice is compared with the standard L-BFGS method (with memory m = 3) on ten nonlinear test problems of varying dimensionality. The modified algorithm yields clearly better numerical results than the standard one.
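As an illustration of the setting, a memoryless BFGS direction can be computed from the single most recent pair (s, y) via the two-loop recursion with a scaled identity as the initial inverse Hessian. The sketch below uses the classical Oren-Spedicato scaling theta = s'y / y'y as a placeholder; the paper's actual new scaling parameter is not specified here.

```python
import numpy as np

def memoryless_lbfgs_direction(g, s, y):
    """Memoryless (single-pair) L-BFGS search direction.

    Two-loop recursion with only the latest pair (s, y) and initial
    inverse Hessian theta*I. The scaling theta = s'y / y'y is an
    illustrative choice, not the paper's proposed parameter.
    Assumes the curvature condition s'y > 0 holds.
    """
    sy = s @ y
    theta = sy / (y @ y)              # scaling of the initial inverse Hessian
    rho = 1.0 / sy
    alpha = rho * (s @ g)             # first loop (one pair)
    q = g - alpha * y
    r = theta * q                     # apply scaled initial matrix
    beta = rho * (y @ r)              # second loop (one pair)
    return -(r + (alpha - beta) * s)  # descent direction
```

Because the update preserves positive definiteness when s'y > 0, the returned direction d satisfies g'd < 0 for any gradient g.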
<p>This work proposes several multi-step three-term Conjugate Gradient (CG) algorithms that satisfy the sufficient descent property and the conjugacy condition. We first review a number of well-known three-term CG methods and then propose two new classes of algorithms of this type, based on the Hestenes-Stiefel (HS) and Polak-Ribière (PR) formulas, in four different versions. The descent and conjugacy conditions are satisfied at every iteration under the strong Wolfe line search and its accelerated version. The new algorithms are modifications of the original HS and PR methods and can be viewed as a form of memoryless BFGS update. All of the proposed methods are proved to be globally convergent and, on our selected set of test problems, are numerically more efficient than comparable methods.</p>
The self-scaling Broyden-Fletcher-Goldfarb-Shanno (BFGS) method is very efficient for solving large-scale optimization problems. In this paper, we present a new algorithm that modifies the self-scaling BFGS method. Motivated by the properties of non-monotone line searches, we also introduce and employ a new non-monotone idea. First, an updated formula for the Hessian approximation is derived that satisfies the secant condition; second, we establish the global convergence of the algorithm under mild conditions, without assuming convexity of the objective function. The reported numerical results show the promising behavior of the new algorithm.
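For reference, a standard self-scaling BFGS update of the Hessian approximation looks as follows. The Oren-Luenberger scaling tau = s'y / s'Bs used here is the classical choice, shown only to illustrate how scaling interacts with the secant condition; the paper's modified update is not reproduced.

```python
import numpy as np

def self_scaling_bfgs_update(B, s, y):
    """Self-scaling BFGS update of the Hessian approximation B:

        B+ = tau * (B - (Bs)(Bs)' / s'Bs) + y y' / s'y,
        tau = s'y / s'Bs   (classical Oren-Luenberger scaling).

    B+ satisfies the secant condition B+ s = y by construction:
    the scaled term annihilates s, and the rank-one term maps s to y.
    Assumes s'y > 0 and B positive definite.
    """
    Bs = B @ s
    sBs = s @ Bs
    sy = s @ y
    tau = sy / sBs
    return tau * (B - np.outer(Bs, Bs) / sBs) + np.outer(y, y) / sy
```

A quick check of the secant condition on a small example confirms B+ s = y, which is the property the paper's first step preserves.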
Abstract (Arabic): In this research, a new algorithm is developed in the field of parallel variable metric algorithms, based on the Wolfe-Powell conditions and using 32 nonlinear test functions, obtaining encouraging results compared with the standard algorithm.
Abstract: In this paper, a new optimal parallel line-search step size is designed to improve the parallel variable metric (VM) algorithm while satisfying the Wolfe-Powell conditions; it is evaluated on thirty-two nonlinear test problems. The proposed algorithm performs well on the selected test problems and outperforms the standard algorithm.
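The Wolfe-Powell conditions referenced above can be checked directly for a candidate step size. The sketch below is a minimal verifier, assuming the standard (weak) form with illustrative constants c1 and c2; the paper's optimal step-size formula itself is not shown.

```python
import numpy as np

def wolfe_powell_satisfied(f, grad, x, d, alpha, c1=1e-4, c2=0.9):
    """Check the (weak) Wolfe-Powell conditions for step alpha along d:

        f(x + alpha d) <= f(x) + c1 * alpha * g'd     (sufficient decrease)
        grad(x + alpha d)'d >= c2 * g'd               (curvature)

    Constants c1 < c2 in (0, 1) are the usual illustrative defaults.
    """
    g0_d = grad(x) @ d
    x_new = x + alpha * d
    sufficient_decrease = f(x_new) <= f(x) + c1 * alpha * g0_d
    curvature = grad(x_new) @ d >= c2 * g0_d
    return sufficient_decrease and curvature
```

For a simple quadratic f(x) = 0.5 x'x along the steepest-descent direction, the exact minimizing step alpha = 1 satisfies both conditions, while a tiny step fails the curvature condition.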