2012
DOI: 10.1007/s00521-012-0909-2
Boost-wise pre-loaded mixture of experts for classification tasks

Cited by 6 publications (1 citation statement)
References 19 publications
“…However, combinations of implicit approaches do not benefit from the improved generalization ability of explicitly creating different training sets by probabilistically changing the distribution of the original training data (bagging, boosting). A method combining the explicit boosting approach with the implicit ME divide-and-conquer approach exists [13]. Nonetheless, the low-bias distribution change of boosting does not ensure a bias-variance trade-off.…”
Section: Classification
Confidence: 99%
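
The citing statement contrasts boosting's explicit re-weighting of the training distribution with the implicit specialization of a mixture of experts. As a minimal sketch of the explicit side only, the AdaBoost-style update below shows how misclassified samples gain probability mass, so each round trains on a different distribution; the function name, the decision-stump weak learner, and the number of rounds are assumptions for illustration, not details taken from the cited paper [13].

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_reweighting(X, y, n_rounds=5):
    """Illustrative AdaBoost-style weight updates; labels y must be in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)  # start from the uniform distribution
    learners, alphas = [], []
    for _ in range(n_rounds):
        # Weak learner fit on the current, explicitly re-weighted distribution
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.sum(w * (pred != y)) / np.sum(w)
        err = np.clip(err, 1e-10, 1 - 1e-10)  # guard against zero/one error
        alpha = 0.5 * np.log((1 - err) / err)
        # Misclassified points gain weight, so the next round effectively
        # sees a different (harder) training set -- the "explicit" change
        # of distribution the citing statement refers to.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        learners.append(stump)
        alphas.append(alpha)
    return learners, alphas, w

In a mixture of experts, by contrast, no such re-weighting is performed: the gating network partitions the input space implicitly during joint training, which is the distinction the citing authors draw before noting that boosting's re-weighting alone does not guarantee a favorable bias-variance trade-off.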