2019
DOI: 10.1007/s10994-019-05822-1

Distribution-free uncertainty quantification for kernel methods by gradient perturbations

Abstract: We propose a data-driven approach to quantify the uncertainty of models constructed by kernel methods. Our approach minimizes the needed distributional assumptions; hence, instead of working with, for example, Gaussian processes or exponential families, it only requires knowledge of some mild regularity of the measurement noise, such as symmetry or exchangeability. We show, by building on recent results from finite-sample system identification, that by perturbing the residuals in the gradient of t…
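The gradient-perturbation idea can be sketched compactly. The Python snippet below is a minimal illustration under the symmetry assumption, using a kernelized least-squares model and a plain sign-flip rank test; the RBF kernel, the test statistic, and the acceptance rule are illustrative assumptions, not the paper's exact construction (which treats, e.g., conditioning and tie-breaking more carefully).

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between the rows of X and Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def sps_accepts(alpha, X, y, m=100, q=95, gamma=1.0, seed=0):
    """Rank test deciding whether the candidate coefficient vector
    `alpha` lies in an (approximately q%) confidence region."""
    rng = np.random.default_rng(seed)
    K = rbf_kernel(X, X, gamma)
    r = y - K @ alpha                     # residuals under the candidate model
    # Reference statistic: norm of the noise-dependent (residual) part of
    # the gradient of the kernel least-squares objective at `alpha`.
    z_ref = np.linalg.norm(K @ r)
    # Perturbed statistics: random sign flips of the residuals, which keep
    # the noise distribution unchanged when the noise is symmetric.
    z_pert = [np.linalg.norm(K @ (rng.choice([-1.0, 1.0], size=y.size) * r))
              for _ in range(m - 1)]
    rank = 1 + sum(z < z_ref for z in z_pert)   # rank of z_ref among m values
    return rank <= int(np.floor(q * m / 100))   # accept unless extremal
```

Because flipping the signs of symmetric noise leaves its distribution unchanged, at the true parameter the reference statistic is exchangeable with the perturbed ones, so its rank is (near) uniform; this is what yields distribution-free coverage without Gaussianity assumptions.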


Cited by 10 publications (9 citation statements); references 19 publications.

Citation statements:
“…FIGURE 1 WTI observed prices (A) and increments of the corresponding factor (B) computed as in (20).…”
Section: The Gârleanu-Pedersen Model: Reinforcement Learning vs Dynam… (mentioning)
confidence: 99%
“…In Figure 1A we display the WTI time series, and in Figure 1B the corresponding factor time series computed as in (20). Maximum likelihood estimation of the GP model (12)-(13) returns the parameters in Table 1.…”
Section: Table (mentioning)
confidence: 99%
“…Their method yielded well-calibrated predictive uncertainty and accurate predictions for both image classification and regression by exploiting Bayesian model averaging over the induced posterior in the subspaces. Csáji et al. [242] introduced a data-driven strategy for uncertainty quantification of models based on kernel techniques. Instead of distributional assumptions such as exponential families or GPs, the method requires only mild regularity conditions on the noise.…”
Section: Further Studies of UQ Methods (mentioning)
confidence: 99%
“…Motivated by finite-sample system identification methods [26], [30], [35], we also suggest two specific algorithms that build exact confidence regions for the regression function of binary classification, under mild statistical assumptions. These new methods are based on the concept of resampling, which is closely related to bootstrap and Monte Carlo tests.…”
Section: Resampling Framework (mentioning)
confidence: 99%
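To make the resampling idea in the quoted passage concrete, here is a minimal, generic Monte Carlo (randomization) rank test in Python; the function names and the residual-symmetry example are illustrative assumptions, not the specific constructions of the cited works.

```python
import numpy as np

def mc_rank_test(observed, simulate, m=500, level=0.05, seed=0):
    # Generic Monte Carlo test: under H0 the observed statistic is
    # exchangeable with the simulated copies, so its rank among all
    # m values is (near) uniform.
    rng = np.random.default_rng(seed)
    sims = np.array([simulate(rng) for _ in range(m - 1)])
    rank = 1 + np.sum(sims < observed)          # ties ignored for brevity
    return rank > m - int(np.floor(level * m))  # True => reject H0

# Example: test whether residuals are symmetric about zero by comparing
# their signed sum against sign-randomized copies.
res = np.array([0.8, -0.2, 1.1, 0.4, -0.5, 0.9, 1.3, -0.1])
reject = mc_rank_test(
    abs(res.sum()),
    lambda rng: abs((rng.choice([-1.0, 1.0], size=res.size) * res).sum()),
)
print(reject)
```

The same rank-ordering principle underlies both bootstrap-style Monte Carlo tests and the exact confidence regions described above: the acceptance threshold directly controls the coverage probability, with no parametric noise model required.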
“…It has also been proved that SPS is strongly consistent [29] under mild statistical assumptions, and that its regions are comparable to the asymptotic confidence ellipsoids. SPS has already been generalized to kernel methods [30] and modified for classification [31], [32]. Nevertheless, its analysis for classification is still limited.…”
Section: Introduction (mentioning)
confidence: 99%