A novel combining classifier method based on Variational Inference
2016 · DOI: 10.1016/j.patcog.2015.06.016
Cited by 48 publications (48 citation statements)
References 29 publications
“…Recently, Nguyen et al [28] proposed the Variational Inference (VI) for multivariate Gaussian distribution (VIG) algorithm to approximate the class-conditional distribution P(x | y = k) for each class k. The VIG algorithm has been demonstrated to offer superior performance for batch learning under an ensemble framework. In this study, we propose a lossless online version of VIG, named OVIG, which not only theoretically converges to its offline counterpart but also achieves the same predictive model regardless of the incremental training order.…”
Section: A Novel Lossless Bayesian Methods
confidence: 99%
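The VIG algorithm cited above applies variational inference to a per-class Gaussian model. As a hedged illustration of the underlying idea only (not the authors' exact VIG or OVIG algorithm), here is a minimal mean-field VI sketch for the simplest case: a univariate Gaussian with unknown mean and precision under a Normal-Gamma prior, following the standard textbook factorization q(mu)q(tau). All parameter names and hyperparameter values are illustrative assumptions.

```python
import numpy as np

def vi_gaussian(x, mu0=0.0, lam0=1.0, a0=1.0, b0=1.0, iters=50):
    """Mean-field VI for a univariate Gaussian with unknown mean and
    precision under a Normal-Gamma prior. The variational posterior is
    q(mu) = N(mu_n, 1/lam_n) and q(tau) = Gamma(a_n, b_n)."""
    n, xbar = len(x), np.mean(x)
    # These two updates are fixed by conjugacy and do not change per iteration.
    mu_n = (lam0 * mu0 + n * xbar) / (lam0 + n)
    a_n = a0 + (n + 1) / 2.0
    e_tau = a0 / b0                       # initial guess for E[tau]
    for _ in range(iters):
        lam_n = (lam0 + n) * e_tau        # precision of q(mu)
        # E_q(mu)[ sum_i (x_i - mu)^2 + lam0 * (mu - mu0)^2 ]
        e_sq = (np.sum((x - mu_n) ** 2) + n / lam_n
                + lam0 * ((mu_n - mu0) ** 2 + 1.0 / lam_n))
        b_n = b0 + 0.5 * e_sq
        e_tau = a_n / b_n                 # refresh E[tau] for the next pass
    return mu_n, lam_n, a_n, b_n

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=0.5, size=500)
mu_n, lam_n, a_n, b_n = vi_gaussian(x)
print(mu_n, a_n / b_n)  # posterior mean near 2.0, E[tau] near 1/0.25 = 4
```

In a classification setting one such variational posterior would be fitted per class k, with new points assigned by comparing the resulting (approximate) class-conditional densities.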
“…These datasets are often used to assess the performance of classification systems [48].

Dataset        Features  Instances  Classes
Abalone        8         4174       3
Artificial     10        700        2
Australian     14        690        2
Blood          4         748        2
Bupa           6         345        2
Contraceptive  9         1473       3
Dermatology    34        358        6
Fertility      9         100        2
Haberman       3         306        2
Heart          13        270        2
Penbased       16        10992      10
Pima           8         768        2
Plant Margin   64        1600       100
Satimage       36        6435       6
Skin_NonSkin   3         245057     2
Tae            20        151        3
Texture        40        5500       10
Twonorm        20        7400       2
Vehicle        18        946        4
Vertebral      6         310        3
Yeast          8         1484       10

We performed extensive comparative studies with a number of existing algorithms, such as Tree (with a maximum of 200 iterations as in [19]), Bagging [21], and Random Subspace [23] (we used 200 learners as in [19]).…”
Section: Datasets and Experimental Settings
confidence: 99%
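The baseline comparison quoted above can be sketched with scikit-learn, whose `BaggingClassifier` covers both Bagging and Random Subspace (the latter by disabling bootstrap row resampling and giving each learner a random feature subset). This is a sketch under assumptions: the dataset here is a built-in stand-in (scikit-learn's breast-cancer data) rather than one from the table, and only the 200-learner count is taken from the quoted setting.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

ensembles = {
    # Bagging [21]: bootstrap-resample the rows for each of 200 trees.
    "Bagging": BaggingClassifier(
        DecisionTreeClassifier(), n_estimators=200, random_state=0),
    # Random Subspace [23]: no row resampling; each tree instead trains
    # on a random half of the feature columns.
    "Random Subspace": BaggingClassifier(
        DecisionTreeClassifier(), n_estimators=200,
        bootstrap=False, max_features=0.5, random_state=0),
}

scores = {}
for name, clf in ensembles.items():
    scores[name] = clf.fit(X_tr, y_tr).score(X_te, y_te)
    print(f"{name}: {scores[name]:.3f}")
```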
“…Ensemble methods have been shown to achieve higher classification accuracy than single-classifier systems and have been applied in many domains, such as object detection and tracking, computer-aided medical diagnosis, and intrusion detection [48]. In general, ensemble methods can be categorized into two types [27]:…”
Section: Introduction
confidence: 99%