2020
DOI: 10.2478/caim-2020-0002
Subsampled Nonmonotone Spectral Gradient Methods

Abstract: This paper deals with subsampled spectral gradient methods for minimizing finite sums. Subsampled function and gradient approximations are employed in order to reduce the overall computational cost of the classical spectral gradient methods. Global convergence is enforced by a nonmonotone line search procedure and is proved provided that functions and gradients are approximated with increasing accuracy. R-linear convergence and worst-case iteration complexity are investigated in the case of strongly convex objective functions. […]
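The abstract outlines the overall scheme: spectral (Barzilai-Borwein) step lengths, subsampled function and gradient estimates whose accuracy grows across iterations, and a nonmonotone line search for globalization. Below is a minimal sketch of that combination; the callables f_sub and g_sub, the sampling schedule, and all safeguards are illustrative assumptions rather than the paper's exact algorithm.

```python
import numpy as np

def subsampled_spectral_gradient(f_sub, g_sub, x0, n, max_iter=100,
                                 sample0=10, growth=1.2, M=10,
                                 gamma=1e-4, seed=0):
    # f_sub(x, idx), g_sub(x, idx): average loss / gradient over the
    # components indexed by idx (assumed user-supplied callables).
    rng = np.random.default_rng(seed)
    x, sample = x0.astype(float).copy(), float(sample0)
    recent, g_old, s_old = [], None, None
    for _ in range(max_iter):
        idx = rng.choice(n, size=min(int(sample), n), replace=False)
        f_val, g = f_sub(x, idx), g_sub(x, idx)
        # spectral (Barzilai-Borwein) step length from the last pair
        if g_old is not None and s_old @ (g - g_old) > 0:
            y = g - g_old
            alpha = (s_old @ s_old) / (s_old @ y)
        else:
            alpha = 1.0
        alpha = min(max(alpha, 1e-10), 1e10)       # safeguard
        # nonmonotone Armijo test against the max of the last M values
        recent = (recent + [f_val])[-M:]
        f_ref, t, d = max(recent), 1.0, -alpha * g
        while f_sub(x + t * d, idx) > f_ref + gamma * t * (g @ d):
            t *= 0.5
        s_old, g_old = t * d, g
        x = x + s_old
        sample *= growth        # accuracy increases across iterations
    return x
```

Comparing against the maximum of the last M subsampled values, rather than the latest one, is what makes the line search nonmonotone: it accepts the occasional increase that spectral step lengths tend to produce.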

Citation types: 0 supporting, 4 mentioning, 0 contrasting.
Cited by: 4 publications (4 citation statements).
References: 19 publications (22 reference statements).
“…This allows us to apply a method of the Spectral Projected Gradient type. The Spectral Projected Gradient (SPG) method, originally proposed in [5], is well known for its efficiency and simplicity and it has been widely used and developed as a solver of constrained optimization problems [4], [8], [11], [16]. The step length selection strategy in SPG method is crucial for faster convergence with respect to classical gradient projection methods because it involves second-order information related to the spectrum of the Hessian matrix.…”
Section: Introduction (citation type: mentioning; confidence: 99%)
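The step length this passage refers to is, in the original SPG method, the Barzilai-Borwein ratio: a Rayleigh quotient that approximates the reciprocal of an average Hessian eigenvalue along the last step, which is the "second-order information related to the spectrum" mentioned above. A minimal single-step sketch, where project is an assumed user-supplied Euclidean projection onto the feasible set:

```python
import numpy as np

def spg_step(x, g, x_prev, g_prev, project):
    # Spectral step: alpha = s's / s'y approximates 1 / lambda, with
    # lambda an eigenvalue-average of the Hessian along the last step.
    s, y = x - x_prev, g - g_prev
    sy = s @ y
    alpha = (s @ s) / sy if sy > 1e-12 else 1.0  # curvature safeguard
    alpha = min(max(alpha, 1e-10), 1e10)
    return project(x - alpha * g)   # projected spectral gradient step
```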
“…The step length selection strategy in SPG method is crucial for faster convergence with respect to classical gradient projection methods because it involves second-order information related to the spectrum of the Hessian matrix. SPG methods for finite sums problem have been investigated in [4], [16]. In [16] they are used in combination with the stochastic gradient method and the convergence is proved assuming that the full gradient is calculated in every m iterations.…”
Section: Introduction (citation type: mentioning; confidence: 99%)
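As one reading of the scheme the passage attributes to [16], the following hedged sketch alternates cheap stochastic gradient steps with an exact full gradient recomputed every m iterations; the step-length choices here are placeholders, not the cited method's actual rules.

```python
import numpy as np

def hybrid_sg_spg(full_grad, sample_grad, x0, project,
                  m=50, max_iter=1000, lr=1e-2):
    # full_grad(x): exact gradient; sample_grad(x): stochastic
    # estimate (both assumed user-supplied callables).
    x = x0.astype(float).copy()
    x_prev, g_prev = None, None
    for k in range(max_iter):
        if k % m == 0:
            g = full_grad(x)           # full gradient every m steps
            # spectral step from the last pair of full-gradient points
            if x_prev is not None and (x - x_prev) @ (g - g_prev) > 0:
                s, y = x - x_prev, g - g_prev
                alpha = (s @ s) / (s @ y)
            else:
                alpha = lr
            x_prev, g_prev = x.copy(), g
        else:
            g, alpha = sample_grad(x), lr   # cheap stochastic step
        x = project(x - alpha * g)
    return x
```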