2020
DOI: 10.48550/arxiv.2007.05566
Preprint

Contrastive Training for Improved Out-of-Distribution Detection

Abstract: Reliable detection of out-of-distribution (OOD) inputs is increasingly understood to be a precondition for deployment of machine learning systems. This paper proposes and investigates the use of contrastive training to boost OOD detection performance. Unlike leading methods for OOD detection, our approach does not require access to examples labeled explicitly as OOD, which can be difficult to collect in practice. We show in extensive experiments that contrastive training significantly helps OOD detection performance…

Cited by 50 publications (100 citation statements)
References 15 publications
“…5.2, we use the Expected Calibration Error (ECE) (Guo et al., 2017) with 15 bins and the Negative Log Likelihood (NLL), which is a strictly proper scoring rule (Gneiting & Raftery, 2007). Datasets: CIFAR10 (Krizhevsky et al.) and CIFAR100 (Krizhevsky et al.) are considered near-OOD datasets to each other (Winkens et al., 2020; Fort et al., 2021). SVHN (Netzer et al., 2011) is considered a far-OOD dataset to both CIFAR10 and CIFAR100 due to its shift in both concept and style.…”
Section: Methods (mentioning)
confidence: 99%
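The 15-bin ECE referenced in this statement is a standard reliability metric: predictions are binned by confidence and the accuracy/confidence gap is averaged, weighted by bin size. A minimal sketch follows; the function name and NumPy interface are illustrative, not taken from any of the cited papers.

```python
import numpy as np

def expected_calibration_error(probs, labels, n_bins=15):
    """ECE (Guo et al., 2017): bin predictions by confidence and average
    the |accuracy - confidence| gap, weighted by the fraction per bin."""
    confidences = probs.max(axis=1)       # confidence of the predicted class
    predictions = probs.argmax(axis=1)    # predicted class per sample
    accuracies = (predictions == labels).astype(float)

    bin_edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        if in_bin.any():
            gap = abs(accuracies[in_bin].mean() - confidences[in_bin].mean())
            ece += in_bin.mean() * gap    # weight by share of samples in bin
    return ece
```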
“…Generalized ODIN (Hsu et al., 2020; see also Techapanurak et al., 2019) includes an additional network in the last layer to improve OOD detection during training. There are many other interesting OOD detection approaches that have achieved state-of-the-art performance without OOD data, such as contrastive learning with various transformations (Winkens et al., 2020; Tack et al., 2020), training a deep ensemble of multiple models (Lakshminarayanan et al., 2016), and leveraging large pretrained models (Fort et al., 2021). These require extended training time, hyperparameter tuning, and careful selection of transformations, whereas our method does not introduce any hyperparameters and has negligible influence on standard cross-entropy training time.…”
Section: Extended Related Work (mentioning)
confidence: 99%
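As a rough illustration of one approach listed above, a deep ensemble (Lakshminarayanan et al., 2016) can flag OOD inputs via the entropy of its averaged predictive distribution. The helper below is a hypothetical sketch under that assumption, not the cited papers' implementation; `models` is any list of trained classifiers with matching output dimensions.

```python
import torch
import torch.nn.functional as F

def ensemble_ood_score(models, x):
    """Score inputs by the predictive entropy of a deep ensemble:
    higher entropy suggests the input is more likely out-of-distribution."""
    with torch.no_grad():
        # Average softmax probabilities over ensemble members.
        probs = torch.stack([F.softmax(m(x), dim=-1) for m in models]).mean(dim=0)
    entropy = -(probs * probs.clamp_min(1e-12).log()).sum(dim=-1)
    return entropy
```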
“…Due to the recent use of self-supervised learning along with contrastive losses in several applications, attempts have been made to use contrastive learning to detect OoD samples. Winkens et al. [37] use the well-known SimCLR method [10] for contrastive training, treating different augmentations of an image as positives and every other image as a negative to learn semantically rich features.…”
Section: Related Work (mentioning)
confidence: 99%
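The SimCLR-style objective this statement describes is commonly implemented as the NT-Xent loss: two augmented views of the same image form the positive pair, and every other embedding in the batch acts as a negative. A minimal sketch is below; the temperature value and tensor shapes are assumptions for illustration, not taken from [10] or [37].

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.5):
    """SimCLR NT-Xent loss. z1, z2 are (N, D) projections of two augmented
    views of the same N images; each view's positive is its counterpart,
    and every other embedding in the 2N-sized batch is a negative."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # (2N, D), unit norm
    sim = z @ z.t() / temperature                        # cosine similarities
    sim.fill_diagonal_(float('-inf'))                    # exclude self-pairs
    n = z1.size(0)
    # The positive for index i is i + N, and vice versa.
    idx = torch.arange(n, device=z.device)
    targets = torch.cat([idx + n, idx])
    return F.cross_entropy(sim, targets)
```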