2023
DOI: 10.21203/rs.3.rs-3129748/v1
Preprint

A Mini-batch Stochastic Recursive Gradient Method with Barzilai-Borwein Step-Size for Machine Learning 

Abstract: As a mini-batch version of the SARAH algorithm, the MB-SARAH algorithm has received extensive attention due to its simple recursive scheme for updating stochastic gradient estimates. In this paper, we propose a modification of the MB-SARAH method that incorporates the Barzilai-Borwein (BB) step size, abbreviated MB-SARAH-BB. MB-SARAH-BB combines advantages of both the MB-SARAH and BB methods, providing robustness to the choice of initial step size during the optimization process. In the framework of MB-SARAH-BB, we propose a nov…
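Since the abstract is truncated, the exact update rule is not shown here; the following is a minimal sketch of what a mini-batch SARAH loop with a BB step size could look like, assuming the common long-form BB step (as used in SVRG-BB) computed from consecutive outer-loop snapshots. The names `mb_sarah_bb` and `grad` are illustrative, not from the paper.

```python
import numpy as np

def mb_sarah_bb(grad, x0, n_samples, batch_size=8, n_outer=20, n_inner=10,
                eta0=0.1, rng=None):
    """Sketch of MB-SARAH with a Barzilai-Borwein (BB) step size.

    grad(x, idx) must return the average gradient of the component
    functions indexed by `idx` at the point `x`.  The BB formula and
    the 1/n_inner scaling follow the SVRG-BB convention; the paper's
    actual rule may differ.
    """
    rng = np.random.default_rng(rng)
    full = np.arange(n_samples)
    x_prev, x = x0.copy(), x0.copy()
    eta = eta0  # initial step size; BB adapts it from the 2nd epoch on
    for s in range(n_outer):
        if s > 0:
            # Long-form BB step from consecutive snapshots:
            # eta = ||dx||^2 / (n_inner * dx^T dg)
            dx = x - x_prev
            dg = grad(x, full) - grad(x_prev, full)
            denom = dx @ dg
            if abs(denom) > 1e-12:
                eta = (dx @ dx) / (n_inner * denom)
        x_prev = x.copy()
        # Full gradient at the snapshot, then SARAH recursive updates.
        v = grad(x, full)
        w_prev, w = x.copy(), x - eta * v
        for _ in range(n_inner):
            idx = rng.choice(n_samples, size=batch_size, replace=False)
            # Recursive stochastic gradient estimate (SARAH):
            v = grad(w, idx) - grad(w_prev, idx) + v
            w_prev, w = w, w - eta * v
        x = w
    return x
```

On a least-squares problem, this sketch removes the need to tune the step size by hand: after the first epoch, the BB formula sets it from curvature information along the snapshot trajectory.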

Cited by 0 publications
References 25 publications (43 reference statements)