2020
DOI: 10.3934/era.2020115

A survey of gradient methods for solving nonlinear optimization

Abstract: The paper surveys, classifies and investigates theoretically and numerically main classes of line search methods for unconstrained optimization. Quasi-Newton (QN) and conjugate gradient (CG) methods are considered as representative classes of effective numerical methods for solving large-scale unconstrained optimization problems. In this paper, we investigate, classify and compare main QN and CG methods to present a global overview of scientific advances in this field. Some o…


Cited by 22 publications (11 citation statements) · References 125 publications

“…k ≥ 1. More details about accelerated gradient methods can be found in [19,22,23]. Since ℷ_k′(α_k) = 0 for α_k = 1/2 and ℷ_k(0) = ℷ_k(1) = 1, mathematical analysis of the function ℷ_k(α_k) on the interval α_k ∈ (0, 1] reveals max ℷ_k(α_k) = ℷ_k(1/2) = 5/4 and 1 ≤ ℷ_k ≤ 5/4.…”
Section: Introduction and Overview of Related Results
confidence: 99%
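The statement above only reports properties of ℷ_k, not its definition. A minimal numerical check, assuming the hypothetical quadratic ℷ_k(α) = 1 + α(1 − α) as a stand-in consistent with the stated values ℷ_k(0) = ℷ_k(1) = 1 and ℷ_k(1/2) = 5/4, reproduces the reported bounds 1 ≤ ℷ_k ≤ 5/4 with the maximum at α_k = 1/2:

```python
import numpy as np

# Hypothetical stand-in: the citing statement does not reproduce the definition
# of gimel_k, so this quadratic is only an assumption that matches the three
# stated values gimel(0) = gimel(1) = 1 and gimel(1/2) = 5/4.
def gimel(alpha):
    return 1.0 + alpha * (1.0 - alpha)

alphas = np.linspace(1e-6, 1.0, 100_001)   # samples of the interval (0, 1]
values = gimel(alphas)

print(values.max())             # approximately 5/4, attained near alpha = 1/2
print(values.min())             # stays >= 1 on (0, 1]
print(alphas[values.argmax()])  # approximately 0.5
```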
“…This is a similar equation to the one in the gradient descent method with exact line search [31]. By using (18), we can always find α that ensures modulus reduction.…”
Section: Algorithm 2: Computation of N P
confidence: 99%
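For context, the exact line search referenced here has a closed form in one standard setting, the convex quadratic. A minimal sketch of steepest descent with that exact step follows; the matrix and right-hand side are illustrative, not taken from the cited work:

```python
import numpy as np

def steepest_descent_exact(A, b, x0, tol=1e-10, max_iter=1000):
    """Steepest descent with exact line search for the convex quadratic
    f(x) = 0.5 * x^T A x - b^T x, with A symmetric positive definite.
    The exact step along -g has the closed form alpha = (g^T g) / (g^T A g)."""
    x = x0.astype(float)
    for _ in range(max_iter):
        g = A @ x - b                      # gradient of the quadratic
        if np.linalg.norm(g) < tol:
            break
        alpha = (g @ g) / (g @ (A @ g))    # exact (Cauchy) step length
        x -= alpha * g
    return x

# Illustrative data (not from the cited work)
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
print(steepest_descent_exact(A, b, np.zeros(2)))  # matches np.linalg.solve(A, b)
```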
“…This method may give the best value of α, but it is computationally too expensive in practice. Instead, we can use a method similar to the backtracking line search known in optimisation [31]. The proposed method is presented in Algorithm 3.…”
Section: Algorithm 2: Computation of N P
confidence: 99%
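For reference, a minimal sketch of the standard Armijo backtracking rule mentioned in this statement; the parameter values (rho, c) and the test function are illustrative choices, not those of the citing paper:

```python
import numpy as np

def backtracking(f, grad, x, d, alpha0=1.0, rho=0.5, c=1e-4):
    """Armijo backtracking: shrink the trial step until the sufficient-decrease
    condition f(x + alpha*d) <= f(x) + c*alpha*grad(x)^T d holds."""
    alpha, fx, slope = alpha0, f(x), grad(x) @ d
    while f(x + alpha * d) > fx + c * alpha * slope and alpha > 1e-12:
        alpha *= rho                      # reduce the step geometrically
    return alpha

# Illustrative use: steepest-descent direction on the Rosenbrock function
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
x0 = np.array([-1.2, 1.0])
print(backtracking(f, grad, x0, -grad(x0)))
```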
“…In a wide range of practical disciplines such as machine learning and signal processing, large-scale continuous optimization models emerge, often as the unconstrained optimization problem min_{x ∈ R^n} f(x) (1.1), where the objective function f is here assumed to be smooth. Scholarly studies reflect the value of CG techniques among the various continuous optimization algorithms [95]. Initially founded by Hestenes and Stiefel (HS) [59] in the middle of the previous century for solving positive definite systems of linear equations, and then adopted by Fletcher and Reeves [51] for unconstrained optimization, CG algorithms benefit from low memory storage and simple iterations, as well as from second-order information exploited implicitly [57,83].…”
Section: Introduction
confidence: 99%
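To make the reference concrete, a minimal sketch of a Fletcher-Reeves nonlinear CG iteration follows; practical implementations typically use Wolfe-condition line searches and periodic restarts, and both the simple backtracking step and the test function below are illustrative assumptions rather than the cited papers' setups:

```python
import numpy as np

def fletcher_reeves(f, grad, x0, tol=1e-6, max_iter=2000):
    """Nonlinear conjugate gradient with the Fletcher-Reeves coefficient
    beta_k = ||g_{k+1}||^2 / ||g_k||^2 and a simple Armijo backtracking step.
    Only the current gradient and direction are stored (low memory cost)."""
    x, g = x0.astype(float), grad(x0)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha, fx, slope = 1.0, f(x), g @ d
        while f(x + alpha * d) > fx + 1e-4 * alpha * slope and alpha > 1e-12:
            alpha *= 0.5                   # backtracking line search
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves update
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Illustrative run on the Rosenbrock function (not an example from the survey)
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(fletcher_reeves(f, grad, np.array([-1.2, 1.0])))
```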