2016 IEEE 16th International Conference on Data Mining (ICDM)
DOI: 10.1109/icdm.2016.0150

Background Check: A General Technique to Build More Reliable and Versatile Classifiers

Cited by 12 publications (8 citation statements)
References 16 publications
“…After all, if a prediction is of poor quality, it would be unreasonable to expect the explanation to be sensible. We suggest that an explanation be accompanied by a list of uncertainty sources, one of which may be the confidence of the predictive model for the instance being explained [39]. For example, if a method relies on synthetic data (as opposed to real data) this should be clearly stated as a source of variability, hence uncertainty.…”
Section: S4 Explanation Quality (mentioning)
confidence: 99%
“…As a third method, we use the Background Check technique proposed by Perello-Nieto et al. [22]. This is an integrated methodology based on an explicit background class b, which they use to update the posterior distribution of the classifier as well as to detect ambiguous and novel cases.…”
Section: Background Check Methods (mentioning)
confidence: 99%
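
The excerpt above describes Background Check in terms of an explicit background class whose probability re-weights the classifier's posteriors. The Python sketch below is only an illustrative simplification of that idea, not the authors' exact method: it assumes class posteriors and a per-instance background probability are already available (both arrays here are hypothetical) and shows how the background mass could be appended as an extra class and used to flag novel or ambiguous inputs.

import numpy as np

def add_background_class(class_posteriors, p_background):
    # class_posteriors: (n_samples, n_classes) posteriors p(y | x, foreground)
    # p_background:     (n_samples,) estimated probability that x is background
    p_foreground = 1.0 - p_background
    # Scale each foreground class by the remaining foreground mass ...
    adjusted = class_posteriors * p_foreground[:, None]
    # ... and append the background class as an extra column.
    return np.column_stack([adjusted, p_background])

# Toy usage with made-up numbers: the second instance looks novel.
posteriors = np.array([[0.9, 0.1],
                       [0.6, 0.4]])
p_bg = np.array([0.05, 0.80])
full_posteriors = add_background_class(posteriors, p_bg)
flag_novel_or_ambiguous = full_posteriors[:, -1] > 0.5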
“…Perhaps most related to our setting is Background Check [22] and we will therefore use it in our experiments. It concerns a technique that does not depend on a specific type of base classifier.…”
Section: Related Work (mentioning)
confidence: 99%
“…Hence, the standard approach is to calibrate the transformation such that γ percent of the examples have a probability > 0.5. Beyond the logistic calibration approach, there is a long literature of approaches [5,7,8,10,11,17] for ensuring that this property is obtained and we now briefly describe some prominent approaches. Isotonic Calibration [20] is a non-parametric form of regression in which the transformation function is chosen from the class of all non-decreasing functions.…”
Section: From Anomaly Scores To Outlier Probabilities (mentioning)
confidence: 99%
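
The last excerpt mentions Isotonic Calibration as a non-parametric way to turn anomaly scores into outlier probabilities. As a minimal sketch of that general approach (not of any specific paper's pipeline), the snippet below fits scikit-learn's IsotonicRegression on hypothetical scores and binary outlier labels, producing a non-decreasing map from score to probability.

import numpy as np
from sklearn.isotonic import IsotonicRegression

# Hypothetical anomaly scores and 0/1 outlier labels for a labelled sample.
scores = np.array([0.05, 0.10, 0.35, 0.40, 0.80, 0.90, 1.20, 1.50])
labels = np.array([0,    0,    0,    0,    1,    1,    1,    1])

# Fit the best non-decreasing function from scores to [0, 1].
iso = IsotonicRegression(y_min=0.0, y_max=1.0, out_of_bounds="clip")
iso.fit(scores, labels)

# Map new scores to calibrated outlier probabilities.
outlier_probabilities = iso.predict(np.array([0.2, 1.0]))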