ICASSP 2022 - IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp43922.2022.9746330
Robust and Efficient Uncertainty Aware Biosignal Classification via Early Exit Ensembles

Cited by 2 publications (3 citation statements)
References 11 publications
“…The output of early exits ensemble can also be computed in a multitude of ways, e.g., as arithmetic mean of the predictions [19], [20], via geometric ensemble [11] or through voting strategies [12]. The effectiveness of early exits ensembles is not limited to image classification, as they've also recently been used for image captioning [22], natural language processing [12], for uncertainty quantification and biosignal classification [13], [20] and to improve robustness against adversarial attacks [19]. Moreover, early exits ensembles were employed to produce a teacher-free knowledge distillation technique by treating the aggregated predictions as the teacher predictions [23].…”
Section: Related Work
confidence: 99%
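The aggregation strategies enumerated in the statement above (arithmetic mean, geometric ensemble, voting) can be sketched as follows. This is a minimal illustration assuming each early exit produces a softmax distribution over classes; the helper `aggregate_exits` is hypothetical and not taken from any of the cited works.

```python
import numpy as np

def aggregate_exits(exit_probs, method="mean"):
    """Combine per-exit class probabilities (shape [n_exits, n_classes])
    into one ensemble prediction, illustrating the strategies above."""
    exit_probs = np.asarray(exit_probs, dtype=float)
    if method == "mean":
        # Arithmetic mean of the exit predictions.
        return exit_probs.mean(axis=0)
    if method == "geometric":
        # Geometric mean of the exit predictions, renormalised to sum to 1.
        log_probs = np.log(exit_probs + 1e-12)
        geo = np.exp(log_probs.mean(axis=0))
        return geo / geo.sum()
    if method == "vote":
        # Majority vote: each exit casts one vote for its argmax class;
        # the result is the share of votes per class.
        votes = np.bincount(exit_probs.argmax(axis=1),
                            minlength=exit_probs.shape[1])
        return votes / votes.sum()
    raise ValueError(f"unknown method: {method}")

# Three hypothetical early exits predicting over 3 classes.
probs = [[0.6, 0.3, 0.1],
         [0.5, 0.4, 0.1],
         [0.2, 0.7, 0.1]]
mean_pred = aggregate_exits(probs, "mean")
vote_pred = aggregate_exits(probs, "vote")
```

Note that the strategies can disagree: here the arithmetic mean favours class 1 (the deepest exit is very confident in it), while majority voting favours class 0 (two of three exits rank it first), which is one reason the choice of aggregation matters.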
“…The experiments are conducted on a large set of ConvNets and datasets and both in a traditional training context and through the tuning of pre-trained architectures. Unlike the main reference works in the literature, which aim to reduce latency as much as possible without sacrificing model accuracy or to develop techniques for edge computing [11]- [13], this work aims to quantify and maximise the accuracy gain that the use of early exits' ensemble can provide.…”
Section: Introduction
confidence: 99%
“…The uncertainty in the environment, referred to as aleatory uncertainty, originates from intrinsic stochasticity and persists even after the environment's model has been learned [4]. While incorporating and quantifying both uncertainties in supervised learning has been explored [5,6], this problem in RL is not yet well-understood [1,2]. Considering uncertainty for decision-making leads to a risk-sensitive policy, where being pessimistic or optimistic about aleatory uncertainty creates a risk-averse or risk-seeking policy, respectively.…”
Section: Introduction
confidence: 99%