2018
DOI: 10.1007/978-3-030-01237-3_34

Out-of-Distribution Detection Using an Ensemble of Self Supervised Leave-Out Classifiers

Abstract: As deep learning methods form a critical part of commercially important applications such as autonomous driving and medical diagnostics, it is important to reliably detect out-of-distribution (OOD) inputs while employing these algorithms. In this work, we propose an OOD detection algorithm which consists of an ensemble of classifiers. We train each classifier in a self-supervised manner by leaving out a random subset of training data as OOD data and the rest as in-distribution (ID) data. We propose a novel ma…
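
The leave-out construction described in the abstract is easy to sketch. The snippet below is a minimal illustration under stated assumptions (the function name `leave_out_splits` and its parameters are hypothetical, not taken from the authors' code): the training set is shuffled once, split into K partitions, and each of the K classifiers treats one partition as OOD and the remaining K-1 partitions as ID.

```python
# Hypothetical sketch of the leave-out data splits; not the authors' code.
import numpy as np

def leave_out_splits(num_samples: int, k: int, seed: int = 0):
    """Yield (id_indices, ood_indices) for each of the k leave-out classifiers."""
    rng = np.random.default_rng(seed)
    partitions = np.array_split(rng.permutation(num_samples), k)
    for i in range(k):
        ood = partitions[i]  # the held-out partition plays the role of OOD data
        id_ = np.concatenate([partitions[j] for j in range(k) if j != i])
        yield id_, ood

# Example: 5 classifiers over a toy dataset of 100 samples.
for clf_idx, (id_idx, ood_idx) in enumerate(leave_out_splits(100, k=5)):
    print(f"classifier {clf_idx}: {len(id_idx)} ID / {len(ood_idx)} held-out OOD")
```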

Cited by 180 publications (156 citation statements)
References 19 publications (32 reference statements)

“…The results are summarized in Table 2, which shows the comparison of our method, ODIN [16] and Ensemble of Leave-Out Classifiers (ELOC) [26] on various benchmarks. In addition, ELOC [26] does not have results for iSUN as an OOD dataset because they use the whole iSUN as a validation dataset.…”
Section: Results
confidence: 99%
“…The current state-of-the-art method for OOD detection is the ensemble of self-supervised leave-out classifiers proposed by Vyas et al. [26]. They divided the training ID data into K partitions and assigned one partition as OOD and the remaining partitions as ID, training K classifiers with a novel loss function, called the margin entropy loss, which increases the prediction confidence of ID samples and decreases the prediction confidence of OOD samples.…”
Section: Related Work
confidence: 99%
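
The quoted description pins down the intent of the margin entropy loss but not its exact form. The sketch below is one plausible reading, in which the hinge form, the `margin` value, and all names are assumptions rather than the formulation from [26]: cross-entropy preserves ID classification accuracy, while a hinge term pushes OOD prediction entropy above ID prediction entropy by at least a margin.

```python
# Assumed, illustrative form of a margin-style entropy loss; the exact
# objective in [26] may differ.
import torch
import torch.nn.functional as F

def prediction_entropy(logits: torch.Tensor) -> torch.Tensor:
    """Mean entropy (in nats) of the softmax predictions."""
    log_p = F.log_softmax(logits, dim=1)
    return -(log_p.exp() * log_p).sum(dim=1).mean()

def margin_entropy_loss(id_logits, id_labels, ood_logits, margin=0.4, beta=1.0):
    ce = F.cross_entropy(id_logits, id_labels)   # keep ID predictions accurate
    h_id = prediction_entropy(id_logits)         # confident ID -> low entropy
    h_ood = prediction_entropy(ood_logits)       # uncertain OOD -> high entropy
    # Hinge: penalize unless OOD entropy exceeds ID entropy by `margin`.
    return ce + beta * F.relu(margin + h_id - h_ood)

# Toy usage with random logits for a 10-class problem.
id_logits, ood_logits = torch.randn(8, 10), torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
print(margin_entropy_loss(id_logits, labels, ood_logits))
```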
“…In order to do so, knowledge of out-of-distribution images (final row), in this case non-dog images, is used to learn a suitable representation. [Novelty detection is a more challenging] task than out-of-distribution detection [23], since novel object samples are expected to come from a distribution similar to that of known samples.…”
Section: Introduction
confidence: 99%