2009
DOI: 10.3724/sp.j.1016.2009.00336

Fast Adaboost Training Algorithm by Dynamic Weight Trimming



Cited by 12 publications (6 citation statements), 2010–2024
References 9 publications
“…However, it is very difficult to specify the upper bound of the weighted error of weak learning in advance, which has limited the application of Boosting [15]. In this paper we propose an evolutionary extreme learning machine based on a dynamic Adaboost ensemble to overcome these faults.…”
Section: Evolutionary Extreme Learning Machine Based on Dynamic Adaboost
confidence: 99%
“…A fuzzy activation function provides advantages such as low computational burden and easy implementation in hardware. The dynamic Adaboost ensemble [15] is a new ensemble learning algorithm based on Adaboost [18] and SWTAdaboost [19]. This ensemble learning algorithm is called Dynamic Weight Trimming Adaboost (DWTAdaboost).…”
Section: Evolutionary Extreme Learning Machine Based on Dynamic Adaboost
confidence: 99%
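The statement above describes DWTAdaboost as building on the standard Adaboost loop. For orientation, the following is a minimal sketch of discrete AdaBoost, the base procedure that the weight-trimming variants modify; the helper names (adaboost_fit, adaboost_predict) and the choice of decision stumps as weak learners are illustrative assumptions, not details from the cited papers.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=50):
    """Discrete AdaBoost sketch; X is an (n, d) array, y an array in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)               # uniform initial sample weights
    learners, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)  # train on the FULL weighted set
        pred = stump.predict(X)
        err = w[pred != y].sum()          # weighted training error
        if err >= 0.5:                    # weak-learning condition violated
            break
        alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-12))
        w *= np.exp(-alpha * y * pred)    # up-weight misclassified samples
        w /= w.sum()                      # renormalize to a distribution
        learners.append(stump)
        alphas.append(alpha)
    return learners, alphas

def adaboost_predict(X, learners, alphas):
    """Weighted majority vote of the weak classifiers."""
    scores = sum(a * h.predict(X) for h, a in zip(learners, alphas))
    return np.sign(scores)
```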
“…However, due to the extra computation required by MI, the training of Gentleboost would become even slower. We extend the idea of dynamical weight trimming (DWT) [18] for this problem. This implementation is still called DWT: in each iteration, training samples are filtered according to a threshold derived from the weight distribution over the training data at the corresponding iteration, and samples with weight less than the threshold are not used for training. Because the weak classifier is not trained on the whole training set, the weighted error of the selected classifier may even be more than or equal to 0.5. The threshold, which filters out such non-effective classifiers, is dynamically changed based on this error.…”
Section: Mutual Information and Dynamical Weight Trimming
confidence: 99%
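A minimal sketch of that trimming step, under stated assumptions: the per-iteration threshold is taken as trim_rate / n (a fraction of the uniform weight), and a round whose weighted error reaches 0.5 is retried with a halved threshold. The parameter name trim_rate, the exact threshold form, and the halving rule are illustrative; the cited paper derives the threshold from the weight distribution at each iteration.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def dwt_adaboost_fit(X, y, n_rounds=50, trim_rate=0.1):
    """AdaBoost with dynamic weight trimming (sketch); y an array in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    learners, alphas = [], []
    for _ in range(n_rounds):
        rate = trim_rate
        while True:
            threshold = rate / n              # fraction of uniform weight (assumption)
            keep = w >= threshold             # trim low-weight samples this round
            stump = DecisionTreeClassifier(max_depth=1)
            stump.fit(X[keep], y[keep], sample_weight=w[keep])
            pred = stump.predict(X)
            err = w[pred != y].sum()          # error measured on the full weighted set
            if err < 0.5 or rate < 1e-6:
                break
            rate /= 2.0                       # relax threshold: retry on more samples
        if err >= 0.5:                        # still non-effective: stop boosting
            break
        alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-12))
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        learners.append(stump)
        alphas.append(alpha)
    return learners, alphas
```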
“…As shown in Table 1, the accuracy of Gentleboost with DWT was only slightly lower than that of the original Gentleboost. A larger value of the trimming threshold brings faster training, but it also introduces more errors [18]. A trade-off between speed and accuracy must be considered.…”
Section: Mutual Information and Dynamical Weight Trimming
confidence: 99%
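That trade-off can be explored empirically by sweeping the trimming parameter. The snippet below reuses the hypothetical dwt_adaboost_fit and adaboost_predict sketches above and assumes pre-split arrays X_train, y_train, X_test, y_test with labels in {-1, +1}.

```python
import time

# Larger trim_rate -> fewer samples per round (faster training), usually
# at some cost in accuracy; the right setting is data-dependent.
for trim_rate in (0.05, 0.1, 0.2, 0.5):
    t0 = time.perf_counter()
    learners, alphas = dwt_adaboost_fit(X_train, y_train, trim_rate=trim_rate)
    elapsed = time.perf_counter() - t0
    acc = (adaboost_predict(X_test, learners, alphas) == y_test).mean()
    print(f"trim_rate={trim_rate:.2f}  train_time={elapsed:.2f}s  test_acc={acc:.3f}")
```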