2022
DOI: 10.48550/arxiv.2209.08951
Preprint

Generalization Bounds for Stochastic Gradient Descent via Localized $\varepsilon$-Covers

Abstract: In this paper, we propose a new covering technique localized for the trajectories of SGD. This localization provides an algorithm-specific complexity measured by the covering number, which can have dimension-independent cardinality in contrast to standard uniform covering arguments that result in exponential dimension dependency. Based on this localized construction, we show that if the objective function is a finite perturbation of a piecewise strongly convex and smooth function with P pieces, i.e. non-convex …
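For background on the abstract's central quantity (a standard definition, not quoted from the paper; the notation $N(\mathcal{T}, d, \varepsilon)$ and the trajectory set $\mathcal{T}$ are our labels, not the authors'): an $\varepsilon$-cover of a set $\mathcal{T}$ under a metric $d$ is a set $C$ such that every point of $\mathcal{T}$ lies within distance $\varepsilon$ of some point of $C$, and the covering number is the smallest cardinality of such a cover,

$$N(\mathcal{T}, d, \varepsilon) \;=\; \min\bigl\{\, |C| \;:\; \forall t \in \mathcal{T},\ \exists c \in C,\ d(t, c) \le \varepsilon \,\bigr\}.$$

Localizing the cover to the trajectories SGD can actually produce, rather than to the whole parameter space, is what allows the cardinality to avoid the exponential dimension dependence of uniform covers that the abstract contrasts against.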

Cited by 0 publications
References 7 publications (14 reference statements)