2019 IEEE International Symposium on Information Theory (ISIT)
DOI: 10.1109/isit.2019.8849398
A New Look at an Old Problem: A Universal Learning Approach to Linear Regression

Abstract: Linear regression is a classical paradigm in statistics. A new look at it is provided via the lens of universal learning. In applying universal learning to linear regression, the hypothesis class represents the label y ∈ R as a linear combination of the feature vector, x^T θ, where x ∈ R^M, within a Gaussian error. The Predictive Normalized Maximum Likelihood (pNML) solution for universal learning of individual data can be expressed analytically in this case, as well as its associated learnability measure. Inter…
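To make the setting in the abstract concrete, the following minimal NumPy sketch fits the linear-Gaussian hypothesis class y = x^T θ + noise by least squares (the ERM learner) and computes the leverage-style quantity x^T (X^T X)^{-1} x for a test feature vector, which is the kind of term that governs how "learnable" a test point is in this setting. All names, shapes, and the synthetic data are illustrative assumptions, not the paper's exact pNML expressions.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 50, 3                       # N training samples, M features

# Synthetic training data from a linear model with Gaussian noise
X = rng.standard_normal((N, M))    # training feature matrix
theta_true = np.array([1.0, -2.0, 0.5])
y = X @ theta_true + 0.1 * rng.standard_normal(N)

# ERM / least-squares estimate of theta
theta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

# Prediction and leverage-style term for a new feature vector x_test;
# x^T (X^T X)^{-1} x is large when x_test lies far from the training data
x_test = rng.standard_normal(M)
y_pred = x_test @ theta_hat
leverage = x_test @ np.linalg.inv(X.T @ X) @ x_test

print(y_pred, leverage)
```

Intuitively, a test point with small leverage is well covered by the training features and easy to predict, while a large leverage signals an individual sample on which the learner should be less confident.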

Cited by 21 publications (17 citation statements). References 12 publications.
“…Our results, together with other work in the last year [6][7][8][9], provide a significant understanding of the ramifications of selecting interpolating solutions of noisy data generated from a high-dimensional, or overparameterized linear model, i.e. where the number of features used far exceeds the number of samples of training data.…”
Section: Conclusion for High-Dimensional Generative Model
confidence: 60%
“…An earlier edition of our work was presented at Information Theory and Applications, February 2019, and subsequently accepted to the IEEE International Symposium on Information Theory, July 2019. Several elegant and interesting papers [6][7][8][9][10][46] have appeared around this time. All of these center on the analysis of the ℓ2-minimizing interpolator.…”
Section: Concurrent Work in High-Dimensional Linear Regression
confidence: 99%
“…Several methods deal with obtaining the pNML learner for different hypothesis sets. Bibas et al. (2019b) and Bibas and Feder (2021) showed the pNML solution for linear regression. Rosas et al. (2020) proposed an NML-based decision strategy for supervised classification problems and showed it attains heuristic PAC learning.…”
Section: DNN Adaptation
confidence: 99%
“…Follow this procedure for every label and normalize to get a valid probability assignment. The pNML was developed before for linear regression (Bibas et al., 2019b) and was evaluated empirically for DNN (Fu and Levine, 2021; Bibas et al., 2019a).…”
Section: Introduction
confidence: 99%