A Framework to Monitor Machine Learning Systems Using Concept Drift Detection
2019 · DOI: 10.1007/978-3-030-20485-3_17

Cited by 6 publications (1 citation statement) · References 13 publications
“…A variety of methods have been devised to ascertain whether two sets of samples are derived from identical distributions, including statistical hypothesis tests such as the Kolmogorov-Smirnov (KS) test [99,150] and Kullback-Leibler (KL) divergence [151]. Yet, these approaches frequently require precise tuning, including the selection of appropriate kernels and hyperparameters.…”
Section: Monitoring
Confidence: 99%
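To make the cited idea concrete, the sketch below computes a two-sample Kolmogorov-Smirnov statistic (the maximum gap between two empirical CDFs) to compare a reference window of data against a production window, as a drift monitor might. This is a minimal illustration of the general technique, not the cited framework's implementation; the sample names, window sizes, and any alarm threshold are assumptions.

```python
import bisect
import random

def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov statistic:
    the maximum absolute gap between the two empirical CDFs."""
    a, b = sorted(sample_a), sorted(sample_b)
    grid = a + b  # the gap is attained at an observed point, so this grid suffices
    ecdf = lambda s, x: bisect.bisect_right(s, x) / len(s)
    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in grid)

random.seed(0)
reference = [random.gauss(0.0, 1.0) for _ in range(1000)]  # training-time feature values
same = [random.gauss(0.0, 1.0) for _ in range(1000)]       # production window, no drift
drifted = [random.gauss(1.0, 1.0) for _ in range(1000)]    # production window, mean shifted

# A monitor would raise an alarm when the statistic exceeds a tuned
# threshold; choosing that threshold is exactly the tuning burden the
# excerpt points out.
print(ks_statistic(reference, same))     # small gap: distributions agree
print(ks_statistic(reference, drifted))  # large gap: distribution shift
```

In practice one would use a library routine such as `scipy.stats.ks_2samp`, which also returns a p-value, rather than hand-rolling the statistic.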