2020
DOI: 10.1109/access.2020.3006711
Building Uncertainty Models on Top of Black-Box Predictive APIs

Abstract: With the commoditization of machine learning, more and more off-the-shelf models are available as part of code libraries or cloud services. Typically, data scientists and other users apply these models as "black boxes" within larger projects. In the case of regressing a scalar quantity, such APIs typically offer a predict() function, which outputs the estimated target variable (often referred to as ŷ or, in code, y_hat). However, many real-world problems may require some sort of deviation interval or uncertain…
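As a hedged illustration of the setup the abstract describes, the sketch below wraps a hypothetical black-box predict() with a deviation interval calibrated on held-out residuals (a split-conformal-style approach). All names here (black_box_predict, fit_interval_wrapper) are illustrative assumptions, not the paper's actual API or method.

```python
import numpy as np

def black_box_predict(X):
    # Stand-in for an off-the-shelf API's predict(); here, a fixed linear rule.
    return 2.0 * X[:, 0] + 1.0

def fit_interval_wrapper(X_cal, y_cal, alpha=0.1):
    """Hypothetical wrapper: calibrate a symmetric deviation interval
    around the black box's point predictions using held-out residuals."""
    residuals = np.abs(y_cal - black_box_predict(X_cal))
    q = np.quantile(residuals, 1 - alpha)  # width covering ~(1 - alpha) of residuals
    def predict_with_interval(X):
        y_hat = black_box_predict(X)
        return y_hat, y_hat - q, y_hat + q
    return predict_with_interval

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))
y = 2.0 * X[:, 0] + 1.0 + rng.normal(scale=0.5, size=200)
predict = fit_interval_wrapper(X, y)
y_hat, lo, hi = predict(X[:5])
```

The key point is that the wrapper never needs the black box's internals: it only calls predict() and models the residual distribution on top.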

Cited by 7 publications (5 citation statements); references 22 publications.
“…However, in this approach, risk measures whose parameter spaces are not shared cannot be addressed simultaneously. In our work, this limitation is addressed by introducing an encoder for risk measures in WV@R. Crossing Quantile Problem: The crossing quantile problem is a significant concern in both statistics (Bondell, Reich, and Wang 2010; Dette and Volgushev 2008) and machine learning (Brando et al. 2022; Zhou, Wang, and Feng 2020). In the field of statistics, Brando et al. (2022) tackled the problem by exploring numerical integration over a non-negative neural network, similar to UCMNN (Wehenkel and Louppe 2019).…”
Section: Related Work (mentioning; confidence: 99%)
“…For example, the authors in [23] proposed simultaneous quantile regression as a method of estimating quantiles by minimizing the pinball loss, whereas the target quantile is randomly sampled in every training iteration. The algorithm presented in [24] is designed to predict an arbitrary number of quantiles, which can maintain quantile monotonicity by restricting the partial derivatives of the quantile functions. The use of these approaches may alleviate the problem of quantile-crossing; however, they cannot eliminate it entirely.…”
Section: A. Prior Work (mentioning; confidence: 99%)
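The pinball loss and the monotonicity idea discussed in this statement can be sketched minimally. The cumulative-increment construction below is an illustrative stand-in for enforcing non-crossing quantiles (in the spirit of restricting the quantile function's partial derivatives), not the cited algorithm itself.

```python
import numpy as np

def pinball_loss(y, q_hat, tau):
    """Pinball (quantile) loss for quantile level tau in (0, 1)."""
    diff = y - q_hat
    return np.mean(np.maximum(tau * diff, (tau - 1) * diff))

# Toy illustration of non-crossing quantiles: build higher quantile levels
# by adding non-negative increments to a base quantile, so the predicted
# quantile curve is non-decreasing by construction.
base = np.array([1.0])                      # e.g. predicted 10% quantile
increments = np.abs(np.array([0.5, 0.7]))   # non-negative gaps between levels
quantiles = np.concatenate([base, base + np.cumsum(increments)])
```

Minimizing the pinball loss at level tau drives q_hat toward the tau-quantile of y; the increment trick guarantees monotonicity across levels regardless of how the increments are predicted.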
“…calibration of neural networks using Platt scaling [52], it was not until recently that such a method was directly used to upgrade any black-box predictive API with an uncertainty score [53]. However, their wrapper is based on deep neural networks and hence is not interpretable. PIM is not a parametric model, but uses a globally interpretable neural network with a single unit.…”
Section: Related Work (mentioning; confidence: 99%)
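To make the Platt-scaling reference concrete, here is a minimal sketch that fits a sigmoid map from raw black-box scores to calibrated probabilities via gradient descent on the log loss. This toy platt_scale function is an assumption for illustration only, not the implementation used in the cited works.

```python
import numpy as np

def platt_scale(scores, labels, lr=0.1, steps=2000):
    """Minimal Platt scaling sketch: fit p = sigmoid(a*score + b) on
    held-out (score, label) pairs by gradient descent on the log loss."""
    a, b = 1.0, 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(a * scores + b)))
        grad = p - labels                  # d(log loss)/d(logit)
        a -= lr * np.mean(grad * scores)
        b -= lr * np.mean(grad)
    return a, b

# Tiny held-out set: raw scores from some black box, plus true labels.
scores = np.array([-2.0, -1.0, 0.5, 1.5, 2.5])
labels = np.array([0.0, 0.0, 1.0, 1.0, 1.0])
a, b = platt_scale(scores, labels)
probs = 1.0 / (1.0 + np.exp(-(a * scores + b)))  # calibrated probabilities
```

The same wrapper pattern applies as in the paper's setting: the calibration layer consumes only the black box's outputs, never its internals.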