CODiT: Conformal Out-of-Distribution Detection in Time-Series Data
2022 | Preprint
DOI: 10.48550/arxiv.2207.11769

Abstract: Machine learning models are prone to making incorrect predictions on inputs that are far from the training distribution. This hinders their deployment in safety-critical applications such as autonomous vehicles and healthcare. The detection of a shift from the training distribution of individual datapoints has gained attention. A number of techniques have been proposed for such out-of-distribution (OOD) detection. But in many applications, the inputs to a machine learning model form a temporal sequence. Existi…
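To make the conformal framing in the title concrete, below is a minimal sketch of inductive conformal OOD detection on fixed-length time-series windows. It is a generic illustration under stated assumptions, not the exact CODiT procedure: the nonconformity score used here (L2 distance to the calibration mean) and all function names (`nonconformity`, `calibrate`, `conformal_p_value`, `is_ood`) are placeholders introduced for this sketch.

```python
# Sketch of inductive conformal OOD detection on time-series windows.
# Assumption: the nonconformity score is a simple distance to the calibration mean,
# standing in for whatever learned score a real detector would use.
import numpy as np

def nonconformity(window: np.ndarray, reference_mean: np.ndarray) -> float:
    """Placeholder nonconformity score: L2 distance of a window to the calibration mean."""
    return float(np.linalg.norm(window - reference_mean))

def calibrate(calibration_windows: np.ndarray):
    """Compute nonconformity scores on held-out in-distribution windows."""
    reference_mean = calibration_windows.mean(axis=0)
    scores = np.array([nonconformity(w, reference_mean) for w in calibration_windows])
    return reference_mean, scores

def conformal_p_value(test_window: np.ndarray, reference_mean, cal_scores) -> float:
    """p-value = fraction of calibration scores at least as extreme as the test score."""
    s = nonconformity(test_window, reference_mean)
    return (np.sum(cal_scores >= s) + 1) / (len(cal_scores) + 1)

def is_ood(test_window, reference_mean, cal_scores, alpha: float = 0.05) -> bool:
    """Flag the window as OOD when its conformal p-value falls below the level alpha."""
    return conformal_p_value(test_window, reference_mean, cal_scores) < alpha

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    cal = rng.normal(0.0, 1.0, size=(200, 16))    # in-distribution calibration windows
    ref_mean, cal_scores = calibrate(cal)
    in_dist = rng.normal(0.0, 1.0, size=16)
    shifted = rng.normal(4.0, 1.0, size=16)       # simulated distribution shift
    print(is_ood(in_dist, ref_mean, cal_scores))  # typically False
    print(is_ood(shifted, ref_mean, cal_scores))  # typically True
```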

Cited by 1 publication (3 citation statements)
References 17 publications
“…Although recent researches start to focus on open set recognition (Kaur et al, 2022; Yoshihashi et al, 2019; Geng et al, 2020…”
Section: Related Work (citation type: mentioning)
Confidence: 99%
“…To have a fair comparison, we also combine out-of-distribution (OOD) with TCN. That is, any OOD time series that detected by (Kaur et al, 2022) will be assigned as new class (with class index k+1), while all the in-distribution (iD) will be assigned class labels by TCN.…”
Section: Baselines (citation type: mentioning)
Confidence: 99%
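The baseline described in that citation statement amounts to a simple rejection rule: run the OOD detector first, map flagged series to an extra class, and otherwise keep the TCN classifier's label. Below is a hedged sketch of that routing logic; `classify_with_ood_rejection`, `ood_detector`, and `tcn_classifier` are placeholder names standing in for the actual models used by the citing work, not APIs from either paper.

```python
# Sketch of the OOD+TCN baseline: OOD-flagged series get a new class index,
# in-distribution series get the TCN classifier's prediction.
from typing import Callable, Sequence

def classify_with_ood_rejection(
    series: Sequence[float],
    ood_detector: Callable[[Sequence[float]], bool],   # True if the series is out-of-distribution
    tcn_classifier: Callable[[Sequence[float]], int],  # returns an in-distribution label in 0..k-1
    num_classes: int,                                   # k, the number of in-distribution classes
) -> int:
    """Return the extra class index k (the (k+1)-th class) for OOD series, else the TCN label."""
    if ood_detector(series):
        return num_classes
    return tcn_classifier(series)
```

With this wiring, the combined baseline can be evaluated on the same (k+1)-class task as an open-set classifier, which is presumably what makes the comparison in the citing paper fair.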