2014
DOI: 10.1016/j.asoc.2014.05.033
Efficiency enhancement of a process-based rainfall–runoff model using a new modified AdaBoost.RT technique

Cited by 31 publications (14 citation statements) · References 30 publications
“…Thus, KNN‐AdaBoost which combined K ‐Nearest Neighbor (KNN) and adaptive boosting (AdaBoost) is proposed as a novel algorithm for efficient black tea samples discrimination. Likewise, the number of PCs and the number of iterations ( M ) were the two main parameters that should be optimized for guaranteeing the accuracy of this model (Liu, Xu, Zhao, Xie, & Zhang, ). Herein, 10 values of M (1–50, step 5) and 10 PCs (1–10, step 1) were selected respectively, and the optimal parameters were determined according to the highest discrimination rates.…”
Section: Results
confidence: 99%
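The excerpt above describes an exhaustive grid search over the two KNN-AdaBoost hyperparameters: the number of boosting iterations M (10 values, 1–50 in steps of 5) and the number of principal components (1–10). A minimal sketch of that selection loop is below; `discrimination_rate` is a hypothetical placeholder for the cross-validated score the study would compute, not the authors' actual code.

```python
import itertools
import numpy as np

# Candidate grids from the excerpt: 10 values of M (1-50, step 5)
# and 10 numbers of principal components (1-10, step 1).
M_grid = list(range(1, 51, 5))   # [1, 6, 11, ..., 46] -> 10 values
pc_grid = list(range(1, 11))     # [1, 2, ..., 10]     -> 10 values

def discrimination_rate(m, n_pcs):
    """Hypothetical stand-in scorer. In the study this would be the
    discrimination rate of a KNN-AdaBoost model trained with m boosting
    iterations on the first n_pcs principal components."""
    rng = np.random.default_rng(m * 100 + n_pcs)  # deterministic stub
    return rng.uniform(0.8, 1.0)

# Exhaustive search: keep the (M, PCs) pair with the highest rate.
best = max(itertools.product(M_grid, pc_grid),
           key=lambda pair: discrimination_rate(*pair))
```

The full grid is only 100 combinations, so exhaustive evaluation is cheap compared with randomized or Bayesian alternatives.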
“…In a later study, the same authors compared the performance of AdaBoosted M5 tree models against ANN models for various applications, including predicting river flows in a catchment; they found higher performance in models that used the AdaBoost.RT algorithm compared to single ANNs (Shrestha and Solomatine, 2006). Liu et al (2014) used AdaBoost.RT for calibrating process-based rainfall-runoff models, and found improved performance over the single model predictions. Wu et al…”
Section: Adaptive Boosting
confidence: 99%
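For readers unfamiliar with the AdaBoost.RT algorithm referenced above: it thresholds each sample's absolute relative error at φ to decide which predictions count as "wrong", raises the weighted error rate to a power to obtain β_t, demotes the correctly predicted samples, and combines the rounds by log(1/β_t) weights. A NumPy-only sketch follows; the weighted linear fit is an assumption made here for brevity (the cited studies boost M5 trees and ANNs, not linear models).

```python
import numpy as np

def adaboost_rt(x, y, n_rounds=10, phi=0.1, power=2):
    """Sketch of AdaBoost.RT for regression.

    Weak learner: a weighted degree-1 polynomial fit (illustrative
    choice, not the base learner used in the cited papers).
    """
    n = len(x)
    w = np.full(n, 1.0 / n)                      # sample weights D_t
    models, betas = [], []
    for _ in range(n_rounds):
        coef = np.polyfit(x, y, deg=1, w=w)      # fit weak learner
        pred = np.polyval(coef, x)
        are = np.abs((pred - y) / y)             # absolute relative error
        wrong = are > phi                        # "wrongly predicted"
        eps = w[wrong].sum()                     # weighted error rate
        eps = min(max(eps, 1e-10), 1 - 1e-10)    # keep beta well-defined
        beta = eps ** power
        w = np.where(wrong, w, w * beta)         # demote correct samples
        w /= w.sum()                             # renormalise
        models.append(coef)
        betas.append(beta)
    log_inv_beta = np.log(1.0 / np.array(betas))
    def predict(x_new):
        preds = np.array([np.polyval(c, x_new) for c in models])
        return (log_inv_beta[:, None] * preds).sum(axis=0) / log_inv_beta.sum()
    return predict
```

Because the relative-error threshold φ separates "correct" from "wrong" samples, its choice strongly affects performance, which is what motivates the self-adaptive variants discussed next.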
“…The approach incorporates a self-adaptive mechanism, supervised by the trend of the root-mean-square error over the iteration process, to ensure good prediction performance. Liu et al. [28] developed a new modified AdaBoost.RT technique that weights the wrongly predicted samples and the loss function, which proved an efficient way to enhance the predictive ability of XXT.…”
Section: Introduction
confidence: 99%
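The self-adaptive mechanism mentioned in the excerpt steers the error threshold φ by the direction of the RMSE trend between boosting rounds. The update rule below is a hypothetical sketch of that idea, with an assumed adjustment factor `lam`; the exact formula in the cited work may differ.

```python
def update_threshold(phi, rmse_t, rmse_prev, lam=0.2):
    """Hypothetical RMSE-trend-supervised update of the AdaBoost.RT
    error threshold phi: tighten it while training error is falling,
    relax it when the error starts to rise."""
    if rmse_t < rmse_prev:           # error shrinking: tighten threshold
        return phi * (1.0 - lam)
    return phi * (1.0 + lam)         # error growing: relax threshold
```

Tying φ to the error trend removes the need to hand-tune it, which is the practical appeal of the self-adaptive variants over the original AdaBoost.RT.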