Information-Theoretic Generalization Bounds for Stochastic Gradient Descent
Preprint, 2021
DOI: 10.48550/arXiv.2102.00931

Citations: cited by 4 publications, with 29 citation statements (0 supporting, 29 mentioning, 0 contrasting), published 2021–2022.
References: 0 publications.
“…The bound was subsequently improved by Negrea et al. (2019); Haghifam et al. (2020); Rodríguez-Gálvez et al. (2020); Wang et al. (2021). Inspired by the work of Pensia et al. (2018), Neu (2021) presents an information-theoretic analysis of models trained with SGD. The analysis of Neu (2021) constructs an auxiliary weight process parallel to SGD training and upper-bounds the generalization error through this auxiliary process.…”
Section: Introduction (mentioning)
Confidence: 99%
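The excerpt above refers to the auxiliary-process technique: alongside the actual SGD iterates, the analysis tracks a noise-perturbed copy of the weights and states the information-theoretic bound for that perturbed process. The sketch below is a minimal illustration under assumptions, not the exact construction from Neu (2021): the function name, the i.i.d. Gaussian perturbation of each iterate, and the noise scale `sigma` are all hypothetical choices made for exposition.

```python
import numpy as np

def sgd_with_auxiliary_process(grad, w0, steps, lr=0.1, sigma=0.01, seed=0):
    # Run plain (full-batch, for simplicity) gradient descent and, in
    # parallel, record a Gaussian-perturbed auxiliary copy of each iterate.
    # The perturbed copies stand in for the auxiliary weight process that
    # the generalization bound is stated for. Names, the coupling, and
    # `sigma` are illustrative assumptions, not the paper's construction.
    rng = np.random.default_rng(seed)
    w = np.asarray(w0, dtype=float)
    aux_path = []
    for _ in range(steps):
        w = w - lr * grad(w)  # ordinary iterate W_t
        aux_path.append(w + sigma * rng.standard_normal(w.shape))  # perturbed copy
    return w, aux_path

# Toy usage with a quadratic loss f(w) = ||w||^2 / 2, whose gradient is w.
w_final, aux_path = sgd_with_auxiliary_process(lambda w: w, w0=[1.0, -2.0], steps=50)
```

In bounds of this general type, the noise scale mediates a trade-off: larger perturbations reduce the mutual information between the auxiliary weights and the training data, but widen the gap between the perturbed loss and the true loss.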
“…Our development builds upon that of Neu (2021). Following the same construction of the auxiliary weight process as in Neu (2021), we present upper bounds on the generalization error that improve upon Neu (2021) in two ways.…”
Section: Introduction (mentioning)
Confidence: 99%