2019
DOI: 10.1007/s10489-019-01421-8

The MBPEP: a deep ensemble pruning algorithm providing high quality uncertainty prediction

Abstract: Machine learning algorithms have been applied effectively to various real-world tasks. However, it is difficult to provide high-quality machine learning solutions that accommodate an unknown distribution of input datasets; this difficulty is called the uncertainty prediction problem. In this paper, a margin-based Pareto deep ensemble pruning (MBPEP) model is proposed. It achieves high-quality uncertainty estimation with a small mean prediction interval width (MPIW) and a high confidence of prediction interval coverage probability (PICP)…
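
The two interval metrics named in the abstract, MPIW and PICP, are simple to compute from a set of prediction intervals. The sketch below is not taken from the paper; the target values and interval bounds are invented purely to show how the metrics are typically evaluated.

```python
import numpy as np

def picp(y_true, lower, upper):
    """Prediction interval coverage probability: fraction of targets
    that fall inside their predicted [lower, upper] interval."""
    covered = (y_true >= lower) & (y_true <= upper)
    return covered.mean()

def mpiw(lower, upper):
    """Mean prediction interval width: average width of the intervals."""
    return (upper - lower).mean()

# Toy example with made-up targets and interval bounds.
y_true = np.array([1.0, 2.0, 3.0, 4.0])
lower  = np.array([0.5, 1.8, 2.0, 4.2])
upper  = np.array([1.5, 2.5, 3.5, 5.0])

print(picp(y_true, lower, upper))  # 0.75 (3 of 4 targets covered)
print(mpiw(lower, upper))          # 1.0  (average interval width)
```

A good interval predictor keeps PICP at or above a target coverage level (e.g. 95%) while making MPIW as small as possible.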

Cited by 23 publications (13 citation statements)
References 28 publications

“…Liu et al. [194] presented a Bayesian nonparametric ensemble method that enhances an ensemble model by augmenting the model's distribution functions with Bayesian nonparametric machinery and a prediction mechanism. Hu et al. [195] proposed a margin-based Pareto deep ensemble pruning (MBPEP) model built on a deep ensemble network, which yields competitive uncertainty estimation with a high prediction interval coverage probability and a small prediction interval width. In another study, the researchers [196] examined the challenges of obtaining uncertainty estimates for structured prediction tasks and presented ensemble-based baselines for sequence-level out-of-domain input detection, sequence-level prediction rejection, and token-level error detection.…”
Section: Ensemble Techniques
confidence: 99%
“…The deep ensemble is another powerful method for measuring uncertainty and has been applied extensively in many real-world applications [195]. To achieve good learning results, the data distribution of the testing dataset should be as close as possible to that of the training dataset.…”
Section: Deep Ensemble
confidence: 99%
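
The deep ensemble idea referenced in the statement above can be illustrated with a minimal sketch: train several independently initialized models on the same data and read the spread of their predictions as an uncertainty estimate. The sketch uses small scikit-learn MLPs as stand-ins for deep networks; the dataset and hyperparameters are invented for illustration and are not the cited authors' setup.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Toy 1-D regression data (invented for illustration).
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)

# Train an ensemble of independently initialized networks.
ensemble = [
    MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=k).fit(X, y)
    for k in range(5)
]

# Ensemble prediction: mean as the point estimate, std as the uncertainty.
X_test = np.linspace(-4, 4, 50).reshape(-1, 1)
preds = np.stack([m.predict(X_test) for m in ensemble])   # (members, points)
mean, std = preds.mean(axis=0), preds.std(axis=0)

# Inside the training range the members tend to agree (small std); outside it
# they disagree, so std grows -- the ensemble's uncertainty signal.
print(std[:5], std[-5:])
```
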
“…Uncertainty quantification (UQ) is an important end-goal in several real-world scientific applications where it is vital to produce distributions of output predictions as opposed to point estimates, allowing for meaningful analyses of the confidence in our predictions. In the context of deep learning, a number of techniques have been developed for UQ, including the use of Bayesian approximations [8,18,19] and ensemble-based methods [6,13,14,24]. A simple approach for performing UQ given a trained DL model is to apply Monte Carlo (MC)-Dropout on the DL weights during testing [4].…”
Section: Background and Related Work, 2.1 Uncertainty Quantification With DL
confidence: 99%
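
MC-Dropout, mentioned at the end of the statement above, keeps dropout active at test time and runs several stochastic forward passes, so the spread of the outputs can be read as predictive uncertainty. The following PyTorch sketch is illustrative only; the network, data, and hyperparameters are assumptions rather than the cited authors' configuration.

```python
import torch
import torch.nn as nn

# A small regression net with dropout (architecture invented for illustration).
model = nn.Sequential(
    nn.Linear(1, 64), nn.ReLU(),
    nn.Dropout(p=0.2),
    nn.Linear(64, 1),
)
# ... assume the model has been trained at this point ...

# MC-Dropout at test time: keep the dropout layers in train mode so that
# each forward pass samples a different dropout mask.
model.eval()
for module in model.modules():
    if isinstance(module, nn.Dropout):
        module.train()

x = torch.linspace(-2, 2, 10).unsqueeze(1)
with torch.no_grad():
    samples = torch.stack([model(x) for _ in range(100)])  # (passes, batch, 1)

mean = samples.mean(dim=0)   # point prediction
std = samples.std(dim=0)     # predictive uncertainty from dropout sampling
print(mean.squeeze(), std.squeeze())
```
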
“…CNN-based counting. With the development of deep learning [11][12][13][14], CNN-based methods [15][16][17][18] can recast counting in highly congested images as a density-estimation problem. Point-level labeling is adopted to annotate the objects, and the density map is generated by applying a Gaussian filter to the point annotations [17].…”
Section: Related Work
confidence: 99%
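
The density-map construction described in the statement above amounts to placing a unit impulse at each annotated point and smoothing with a Gaussian kernel, so that the map integrates to the object count. A minimal sketch follows; the image size, point coordinates, and the fixed sigma are invented for illustration (the cited works often use adaptive kernel widths instead).

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def density_map(points, shape, sigma=4.0):
    """Build a density map from point annotations: one unit of mass per
    object, spread by a Gaussian kernel. The map sums to the object count."""
    canvas = np.zeros(shape, dtype=np.float64)
    for x, y in points:                      # (col, row) annotations
        canvas[int(y), int(x)] += 1.0
    return gaussian_filter(canvas, sigma=sigma)

# Toy example: three annotated objects in a 64x64 image (made-up coordinates).
points = [(10, 12), (30, 40), (50, 20)]
dmap = density_map(points, shape=(64, 64))

print(dmap.sum())   # ~3.0: integrating the density map recovers the count
```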