2020
DOI: 10.48550/arxiv.2007.05134
Preprint
Revisiting One-vs-All Classifiers for Predictive Uncertainty and Out-of-Distribution Detection in Neural Networks

Cited by 14 publications
(30 citation statements)
References 0 publications
“…Classification methods For the classification objective function L_cls, we use a softmax classifier with a cross-entropy loss (CE) or a one-vs-all classifier with a distance-maximization loss (OVADM) [18]. We use the OVADM loss and the CE loss to analyze the effect of a distance-based loss on OOD detection.…”
Section: Methods
confidence: 99%
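The OVADM head quoted above can be sketched in a few lines — a minimal NumPy illustration, assuming (as a simplification of the distance-maximization idea in [18]) that each class score is an independent sigmoid of the negative Euclidean distance to a per-class centroid, and that a low maximum score flags an OOD input. All function and variable names here are hypothetical, not taken from the paper's code:

```python
import numpy as np

def ova_probabilities(features, centroids):
    """One-vs-all probabilities from negative Euclidean distances to
    per-class centroids (a simplified stand-in for an OVADM head)."""
    # dists has shape (n_samples, n_classes) via broadcasting
    dists = np.linalg.norm(features[:, None, :] - centroids[None, :, :], axis=-1)
    # each class gets an independent sigmoid of its negative distance
    return 1.0 / (1.0 + np.exp(dists))  # sigmoid(-dist)

def ood_score(probs):
    """Confidence score: the maximum one-vs-all probability.
    Low values suggest an out-of-distribution input."""
    return probs.max(axis=-1)

# toy example: two well-separated class centroids in 2-D
centroids = np.array([[0.0, 0.0], [10.0, 10.0]])
in_dist = np.array([[0.1, -0.1]])    # near the class-0 centroid
far_out = np.array([[50.0, -50.0]])  # far from every centroid

p_in = ova_probabilities(in_dist, centroids)
p_out = ova_probabilities(far_out, centroids)
```

Unlike a softmax, the per-class sigmoids are not forced to sum to one, so a point far from every centroid can receive a uniformly low score rather than being pushed toward some class.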
“…An out-of-distribution (OOD) detection task is to recognize outliers or anomalies that do not follow the distribution of the training data. To address this problem, numerous methods have been proposed for classification tasks across several modalities, including computer vision [8,16,17,9,23], natural language processing [12,21], and two-dimensional real-number datasets (e.g., Gaussian noise distributions) [19,16,18,24]. In particular, [8,16,17] are simple yet efficient algorithms, since they use existing trained models for OOD detection without additional fine-tuning on OOD samples.…”
Section: Introduction
confidence: 99%
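The post-hoc baselines this quote refers to — methods that reuse an already-trained classifier with no OOD fine-tuning, such as the maximum softmax probability (MSP) of Hendrycks and Gimpel — can be sketched as follows. This is a minimal NumPy sketch with illustrative logits, not the cited papers' code:

```python
import numpy as np

def softmax(logits, axis=-1):
    z = logits - logits.max(axis=axis, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def msp_score(logits):
    """Maximum softmax probability: a post-hoc OOD score that needs
    no retraining, only the logits of an already-trained classifier."""
    return softmax(logits).max(axis=-1)

# a confident (peaked) prediction vs a diffuse one over three classes
confident = np.array([[9.0, 0.5, 0.2]])
uncertain = np.array([[1.1, 1.0, 0.9]])
```

Thresholding `msp_score` then separates inputs the trained model is confident about from ones it finds ambiguous, which is exactly why these baselines require no access to OOD samples at training time.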
“…The task assumes plenty of labeled inliers in the training data. Padhy et al. employ OVA classifiers for this task [30]. Hendrycks et al. show that exposing a model to outlier data allows it to detect anomalies effectively [18], training it on out-of-distribution data in a supervised way.…”
Section: Related Work
confidence: 99%
“…To find similar metrics of OOD, we explored the general OOD literature for models that we could adapt for image segmentation. Variations of classical one-versus-all models have been adapted to neural networks (Padhy et al., 2020; Franchi et al., 2020). The closest work that we could find to our proposed approach uses a deep network as a feature extractor for an RBF network (van Amersfoort et al., 2020).…”
Section: Related Work
confidence: 99%
“…In this section we evaluate our method alongside recent OOD models (van Amersfoort et al., 2020; Franchi et al., 2020; Padhy et al., 2020), assessing whether they can reach segmentation performance comparable to well-established deterministic models and whether they can accurately detect outliers.…”
Section: Evaluation on Brain MRI
confidence: 99%