2022
DOI: 10.1109/access.2021.3138978

One Versus All for Deep Neural Network for Uncertainty (OVNNI) Quantification

Abstract: Deep neural networks (DNNs) are powerful learning models, yet their results are not always reliable. This drawback results from the fact that modern DNNs are usually overconfident, and consequently their epistemic uncertainty cannot be straightforwardly characterized. In this work, we propose a new technique to quantify the epistemic uncertainty of data easily. This method consists in mixing the predictions of an ensemble of DNNs trained to classify One class vs. All the other classes (OVA) with predictions fro…
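The combination the abstract describes can be sketched as follows. This is an illustrative reconstruction, not the authors' code: it assumes the AVA network produces standard class logits, the OVA ensemble produces one binary logit per class, and the two are combined by an element-wise product (the function name `ovnni_scores` is hypothetical).

```python
import numpy as np

def ovnni_scores(ava_logits, ova_logits):
    """Combine All-vs-All (AVA) softmax probabilities with
    One-vs-All (OVA) sigmoid scores, one binary classifier per class."""
    # AVA branch: standard softmax over class logits
    shifted = ava_logits - ava_logits.max(axis=-1, keepdims=True)
    exp = np.exp(shifted)
    ava_probs = exp / exp.sum(axis=-1, keepdims=True)
    # OVA branch: independent sigmoid per one-vs-all classifier
    ova_probs = 1.0 / (1.0 + np.exp(-ova_logits))
    # Element-wise product: the score stays low unless both branches
    # agree, which serves as the epistemic-uncertainty signal
    return ava_probs * ova_probs

# One sample, three classes: both branches favor class 0
scores = ovnni_scores(np.array([[2.0, 0.5, -1.0]]),
                      np.array([[1.5, -0.5, -2.0]]))
```

On out-of-distribution inputs the OVA classifiers tend to output low scores for every class, which pulls all combined scores down even when the AVA softmax is overconfident.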

Cited by 7 publications (5 citation statements) · References 59 publications
“…6 is much smaller, which results in the evener distributed labels, and we consider this is the main cause of its worse performance. Yang et al [9] based model outperforms the others, which indicates that the Multi-BCE loss is more suitable for uncertainty estimation, which is similar to the one-versus-all strategy [29].…”
Section: Methods
confidence: 93%
“…One-vs-All methods in the context of OoD detection have been recently studied by Franchi et al [2020], Padhy et al [2020], Saito and Saenko [2021]. In the work by Franchi et al [2020] an ensemble of binary neural networks is trained to perform one-vs-all classification on the in-distribution data which are then weighted by a standard softmax classifier. Padhy et al [2020] use a DNN with a single sigmoid binary output for every class and explore the possibility of training the one-vs-all network with a distance based loss function instead of the binary cross entropy.…”
Section: Related Work
confidence: 99%
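The per-class sigmoid objective these statements refer to (the Multi-BCE / one-vs-all loss) can be sketched as below. This is a minimal illustration under stated assumptions, not any cited author's implementation: labels are integer class indices, and each class output is an independent sigmoid trained with binary cross-entropy against a one-hot target.

```python
import numpy as np

def multi_bce_loss(logits, labels, n_classes):
    """One-vs-all training objective: each class output is an
    independent sigmoid; binary cross-entropy pushes the true
    class toward 1 and every other class toward 0."""
    targets = np.eye(n_classes)[labels]        # one-hot targets
    probs = 1.0 / (1.0 + np.exp(-logits))      # per-class sigmoid
    eps = 1e-12                                # numerical safety for log
    bce = -(targets * np.log(probs + eps)
            + (1.0 - targets) * np.log(1.0 - probs + eps))
    return bce.mean()

# One sample of class 0 whose logits already favor the right class
loss = multi_bce_loss(np.array([[3.0, -2.0, -2.0]]), np.array([0]), 3)
```

Unlike a softmax cross-entropy, each output is penalized independently, so a network trained this way can assign low scores to all classes at once, which is what makes the setup attractive for OoD detection.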