2022
DOI: 10.48550/arxiv.2204.06507
Preprint

Out-of-Distribution Detection with Deep Nearest Neighbors

Cited by 5 publications (10 citation statements)
References 0 publications

“…While free of distributional assumptions, the OOD detector of [8] needs the right tuning of some parameters. Our solution maintains the former property and does not rely on any critical parameter setting.…”
Section: Related Work
Citation type: mentioning
Confidence: 88%
“…If the data observed at runtime generate a histogram "significantly different" from the training one, the data are OoD. Unlike K-NN [8] and the neural-network distance of [9], where a single distance criterion is defined, the similarity measure can be derived through multiple metrics. This supports the tests mentioned by EASA, since the proposed method measures incremental degrees of departure from the in-distribution.…”
Section: A. Contribution
Citation type: mentioning
Confidence: 99%
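
For concreteness, the histogram test described in this quoted statement can be sketched as below. This is a minimal sketch, assuming a generic scalar score per sample; the bin count, the particular metrics (Jensen-Shannon, Wasserstein, total variation), and the threshold in the usage comment are illustrative assumptions, not the cited paper's actual configuration.

```python
# Sketch of histogram-based OOD detection: build an in-distribution reference
# histogram from training-time scores, then measure how far a runtime histogram
# departs from it with several metrics rather than a single distance criterion.
import numpy as np
from scipy.spatial.distance import jensenshannon
from scipy.stats import wasserstein_distance

def fit_reference_histogram(train_scores, n_bins=50):
    """Reference (in-distribution) histogram built from training-time scores."""
    counts, edges = np.histogram(train_scores, bins=n_bins, density=True)
    return counts, edges

def histogram_departure(runtime_scores, ref_counts, edges):
    """Several measures of departure between the runtime and training histograms."""
    run_counts, _ = np.histogram(runtime_scores, bins=edges, density=True)
    p = ref_counts / (ref_counts.sum() + 1e-12)   # normalise to probability vectors
    q = run_counts / (run_counts.sum() + 1e-12)
    centers = 0.5 * (edges[:-1] + edges[1:])      # bin centres for the Wasserstein metric
    return {
        "jensen_shannon": float(jensenshannon(p, q)),
        "wasserstein": float(wasserstein_distance(centers, centers, p, q)),
        "total_variation": float(0.5 * np.abs(p - q).sum()),
    }

# Usage (threshold is illustrative, to be calibrated on held-out in-distribution data):
# counts, edges = fit_reference_histogram(train_scores)
# metrics = histogram_departure(runtime_scores, counts, edges)
# batch_is_ood = metrics["jensen_shannon"] > 0.3
```
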
“…The work in [97] measures the Mahalanobis distance to class centroids to detect OOD. The study in [98] uses the nearest-neighbor distance for OOD detection. Note that the approach in this chapter is complementary to the latest OOD detection methods.…”
Section: Related Work
Citation type: mentioning
Confidence: 99%
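
The two distance-based scores referenced in this last statement can be sketched as follows. This is a minimal sketch, assuming features have already been extracted from a trained network; the L2 normalisation, the value of k, and the tied-covariance estimate are common choices for these detectors, assumed here rather than taken from the quoted text.

```python
# Sketch of the two feature-space detectors mentioned in the quote:
# the Mahalanobis distance to class centroids [97] and the k-th
# nearest-neighbour distance in L2-normalised feature space [98].
import numpy as np
from sklearn.neighbors import NearestNeighbors

def knn_ood_score(train_feats, test_feats, k=50):
    """k-th nearest-neighbour distance to the training feature bank; larger => more likely OOD."""
    def l2_normalise(x):
        return x / (np.linalg.norm(x, axis=1, keepdims=True) + 1e-12)
    bank = NearestNeighbors(n_neighbors=k).fit(l2_normalise(train_feats))
    dists, _ = bank.kneighbors(l2_normalise(test_feats))   # shape (n_test, k)
    return dists[:, -1]                                    # distance to the k-th neighbour

def mahalanobis_ood_score(train_feats, train_labels, test_feats):
    """Smallest Mahalanobis distance to any class centroid; larger => more likely OOD."""
    classes = np.unique(train_labels)
    centroids = np.stack([train_feats[train_labels == c].mean(axis=0) for c in classes])
    # Tied covariance estimated from class-centred training features.
    centred = train_feats - centroids[np.searchsorted(classes, train_labels)]
    prec = np.linalg.pinv(np.cov(centred, rowvar=False))
    diffs = test_feats[:, None, :] - centroids[None, :, :]      # (n_test, n_class, d)
    d2 = np.einsum("ncd,de,nce->nc", diffs, prec, diffs)        # squared distances per class
    return np.sqrt(d2.min(axis=1))
```

In both cases a sample is flagged as OOD when its score exceeds a threshold calibrated on held-out in-distribution data.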