2023
DOI: 10.1109/tit.2022.3222913
Asymptotic Errors for Teacher-Student Convex Generalized Linear Models (Or: How to Prove Kabashima’s Replica Formula)

Abstract: There has been a recent surge of interest in the study of asymptotic reconstruction performance in various cases of generalized linear estimation problems in the teacher-student setting, especially for the case of i.i.d. standard normal matrices. Here, we go beyond these matrices and prove an analytical formula for the reconstruction performance of convex generalized linear models with rotationally-invariant data matrices with arbitrary bounded spectrum, rigorously confirming, under suitable assumptions, a con…
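For orientation, the setting described in the abstract can be sketched as follows; the notation below is mine and is only a summary of the generic teacher-student convex GLM setup, not a formula taken from the paper. A teacher produces labels from a rotationally-invariant data matrix, and the student fits a convex estimator:

y_\mu = \varphi\big(\langle a_\mu, x^\star \rangle\big), \quad \mu = 1, \dots, n, \qquad A = U\,\mathrm{diag}(s)\,V^\top \ \text{with Haar-distributed } U, V \text{ and bounded spectrum } s,

\hat{x} = \operatorname{arg\,min}_{x \in \mathbb{R}^d} \ \sum_{\mu=1}^{n} \ell\big(y_\mu, \langle a_\mu, x \rangle\big) + \lambda\, r(x),

with convex loss \ell and convex regulariser r. A typical quantity characterised by such asymptotic formulas is the reconstruction error \lim_{d \to \infty} \mathbb{E}\,\|\hat{x} - x^\star\|^2 / d in the proportional regime n/d \to \alpha.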

Cited by 13 publications (4 citation statements); references 49 publications.

Citation statements (ordered by relevance):
“…For ridge regression, there are also precise predictions thanks to random matrix theory [12, 36–41]. A related set of results was obtained in [42] for orthogonal random matrix models. The main technical novelty of our proof is the handling of a generic loss and regularisation, not only ridge, representing convex empirical risk minimization, for both classification and regression, with the generic correlation structure of the model (1.1).…”
Section: J Stat Mech (2022) 114001 (mentioning)
confidence: 99%
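As a toy illustration of the convex empirical risk minimisation with a generic loss and regulariser discussed in this statement, here is a minimal sketch (my own assumed setup with an i.i.d. Gaussian design and logistic loss plus ridge, not code from any of the cited works):

# Minimal sketch (assumed setup, not from the cited papers): convex empirical risk
# minimisation with a generic convex loss plus a regulariser, here logistic + ridge.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, d, lam = 300, 100, 0.1
x_star = rng.standard_normal(d) / np.sqrt(d)   # teacher vector
A = rng.standard_normal((n, d)) / np.sqrt(d)   # simplest case: i.i.d. Gaussian design
y = np.sign(A @ x_star)                        # teacher labels for classification

def objective(x):
    margins = y * (A @ x)
    # logistic loss log(1 + exp(-margin)) summed over samples, plus ridge penalty
    return np.logaddexp(0.0, -margins).sum() + 0.5 * lam * (x @ x)

x_hat = minimize(objective, np.zeros(d), method="L-BFGS-B").x
overlap = x_hat @ x_star / (np.linalg.norm(x_hat) * np.linalg.norm(x_star))
print("teacher-student overlap:", overlap)

Any other convex loss (square, hinge) and regulariser can be substituted in objective; the generic-loss results quoted above concern exactly this kind of estimator.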
“…entries should be discussed for practical usage. The rotationally invariant matrix is one of the candidates to examine the effectiveness of the nonconvexity control for the general matrix [26–28].…”
Section: Summary and Discussion (mentioning)
confidence: 99%
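A quick way to sample the rotationally invariant matrices mentioned here is the usual construction A = U diag(s) V^T with Haar-distributed orthogonal factors and a chosen bounded spectrum; the sketch below assumes that convention and is only illustrative:

# Sketch: sampling a rotationally invariant matrix with an arbitrary bounded spectrum.
import numpy as np
from scipy.stats import ortho_group

n, d = 200, 150
U = ortho_group.rvs(n, random_state=1)      # Haar-distributed orthogonal, n x n
V = ortho_group.rvs(d, random_state=2)      # Haar-distributed orthogonal, d x d
rng = np.random.default_rng(3)
s = rng.uniform(0.5, 2.0, size=min(n, d))   # any bounded spectrum, e.g. uniform on [0.5, 2]

S = np.zeros((n, d))
S[np.arange(min(n, d)), np.arange(min(n, d))] = s
A = U @ S @ V.T                             # law of A is invariant under left/right rotations

Choosing s to match the singular values of an i.i.d. Gaussian matrix recovers the standard Gaussian setting, which is a special instance of this ensemble.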
“…Exact asymptotics for Bayesian estimation in generalized linear models was rigorously established in [11]. On the empirical risk minimization side, exact asymptotics based on different techniques, such as Convex Gaussian min-max theorem [5,16,19,36,38,51,52,62], Random Matrix Theory [43], GAMP [23,39] and first order expansions [14] have been used to study high-dimensional logistic regression and max-margin estimation.…”
Section: Exact Asymptotics (mentioning)
confidence: 99%
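For concreteness, the max-margin estimation mentioned in this statement is the convex program min ||x||^2 subject to y_mu <a_mu, x> >= 1, feasible with high probability when n/d is below the separability threshold. The sketch below is an assumed toy setup solved with a generic solver, not code from the cited works:

# Sketch: hard-margin (max-margin) estimation as a convex program, solved with SLSQP.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
n, d = 40, 60                                # n/d < 1, so the data is separable w.h.p.
x_star = rng.standard_normal(d)
A = rng.standard_normal((n, d)) / np.sqrt(d)
y = np.sign(A @ x_star)

res = minimize(
    lambda x: 0.5 * (x @ x),                 # minimise the squared norm ...
    x0=rng.standard_normal(d),
    jac=lambda x: x,
    constraints={"type": "ineq", "fun": lambda x: y * (A @ x) - 1.0},  # ... at margin >= 1
    method="SLSQP",
)
x_mm = res.x
print("smallest margin:", (y * (A @ x_mm)).min())

The exact-asymptotics results cited above predict deterministic limits for quantities such as the margin or the overlap of x_mm with the teacher as n, d grow proportionally.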