2021 · Preprint
DOI: 10.48550/arxiv.2111.07322

CSG: A stochastic gradient method for a wide class of optimization problems appearing in a machine learning or data-driven context

Abstract: A recent article introduced the continuous stochastic gradient method (CSG) for the efficient solution of a class of stochastic optimization problems. While the applicability of known stochastic gradient-type methods is typically limited to expected risk functions, no such limitation exists for CSG. This advantage stems from the computation of design-dependent integration weights, allowing for optimal usage of available information and therefore stronger convergence properties. However, the nature of the formu…
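To make the mechanism concrete, here is a minimal, hedged sketch of a CSG-style iteration. Everything in it is an illustrative assumption rather than the paper's algorithm: the toy objective J(u) = E_x[(u − x)²] with x ~ U(0, 1), the fixed step size, and the grid-based nearest-neighbor weight construction. It illustrates the point the abstract makes: past sample gradients are reused, with design-dependent integration weights, rather than discarded as in plain stochastic gradient descent.

```python
import numpy as np

# Hedged sketch of a CSG-style loop (illustrative, not the paper's algorithm).
# We minimize J(u) = E_x[(u - x)^2] with x ~ U(0, 1), so u* = E[x] = 0.5.
# Every past sample gradient is kept; its integration weight is the measure of
# quadrature points assigned to it by a nearest-neighbor rule that penalizes
# distance in design AND sample space, so gradients taken at designs far from
# the current one fade out as the iteration progresses.

def csg_weights(us, xs, u_now, grid):
    """Weight of stored pair (u_i, x_i) = fraction of the quadrature grid whose
    nearest stored pair (design distance + sample distance) is pair i."""
    cost = np.abs(np.asarray(us) - u_now)[:, None] \
         + np.abs(np.asarray(xs)[:, None] - grid[None, :])
    owner = np.argmin(cost, axis=0)          # grid point -> nearest stored pair
    return np.bincount(owner, minlength=len(us)) / grid.size

rng = np.random.default_rng(0)
grid = np.linspace(0.0, 1.0, 1001)   # quadrature grid on the sample space
u = 5.0                              # arbitrary initial design
us, xs, grads = [], [], []           # stored designs, samples, gradients
tau = 0.25                           # fixed step (the paper uses a line search)

for n in range(200):
    x = rng.uniform()                    # one new sample per iteration
    us.append(u)
    xs.append(x)
    grads.append(2.0 * (u - x))          # grad_u (u - x)^2 at the current design
    w = csg_weights(us, xs, u, grid)     # design-dependent integration weights
    u -= tau * float(w @ np.asarray(grads))

print(round(u, 3))   # close to the true minimizer 0.5
```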

Cited by 1 publication (6 citation statements) · References 17 publications

Citation statements:
“…The initial design was chosen at random. Furthermore, the integration weights were obtained by the inexact hybrid method proposed in [14], while the step size was determined by the Armijo-type line search introduced in [15]. In this setting, it appears that the CSG approximations almost immediately give very close approximations to the full objective function, see Figure 3b.…”
Section: Results for (1) optimizing color of nanoparticles in solution
Citation type: mentioning · Confidence: 99%
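For context, an Armijo-type line search backtracks from a trial step until a sufficient-decrease condition holds. The sketch below shows the standard textbook rule, not the specific variant introduced in [15] (which, per the quote, works with the CSG approximations of the objective and gradient rather than exact values); the function names and constants are illustrative assumptions.

```python
import numpy as np

def armijo_step(J, u, g, d, tau0=1.0, beta=0.5, c=1e-4, max_backtracks=30):
    """Standard Armijo backtracking: shrink tau until the sufficient-decrease
    condition J(u + tau * d) <= J(u) + c * tau * <g, d> holds. Generic sketch;
    the CSG variant in [15] replaces J and g by the CSG approximations built
    from the weighted sample history."""
    J0 = J(u)
    slope = float(np.dot(g, d))   # directional derivative; negative if d descends
    tau = tau0
    for _ in range(max_backtracks):
        if J(u + tau * d) <= J0 + c * tau * slope:
            break
        tau *= beta               # backtrack
    return tau

# Usage on a toy quadratic with the steepest-descent direction d = -g:
J = lambda u: float(u @ u)
u = np.array([3.0, -2.0])
g = 2.0 * u
tau = armijo_step(J, u, g, -g)
print(tau, J(u + tau * -g))       # accepted step and the decreased objective
```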
“…The approximations to the objective function and the full gradient generated by the CSG method are asymptotically exact, as the next proposition shows. Proposition 3.1 Denote by Ĝ_n and Ĵ_n the CSG approximations to ∇J(u_n) and J(u_n), respectively, where the integration weights were calculated by one of the methods listed in [14]. Then, it holds…”
Section: The continuous stochastic gradient descent method
Citation type: mentioning · Confidence: 99%
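The quotation breaks off after "Then, it holds". Given the stated asymptotic exactness, the omitted conclusion is presumably of the following form; this is a hedged reconstruction in the quote's notation, not text from the paper:

```latex
% Hedged reconstruction of the truncated claim (not quoted from the paper):
\[
  \bigl\lVert \hat{G}_n - \nabla J(u_n) \bigr\rVert \;\longrightarrow\; 0
  \quad\text{and}\quad
  \bigl\lvert \hat{J}_n - J(u_n) \bigr\rvert \;\longrightarrow\; 0
  \qquad \text{as } n \to \infty \text{, almost surely.}
\]
```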